Comparing QualysGuard PCI to Comodo HackerGuardian

I’m not doing e-commerce transactions, so it’s not required that I run the quarterly external PCI compliance scans dictated by the PCI Security Standards Council. But, for the fun of it, I ran an official scan from two Approved Scanning Vendor (ASV) PCI programs. I am evaluating the QualysGuard PCI On-Demand program for something in my professional life, and using a part of my personal life (my blog) as the test case. This is great: I get to play with a new technology, discover and fix vulnerabilities on my web site, share my experience with you, help others identify and correct these problems on their servers, and get paid for it, all at the same time. Can I get a whoot?

I was shocked at how many problems it found with my web site. In this article, I will walk through the report and discuss how I fixed each item to become PCI scan compliant. Then I will give my review of the QualysGuard PCI program.

Before I remediated any of the problems QualysGuard PCI found, I signed up for Comodo’s HackerGuardian program. I wanted to see whether it found the same problems or a different set entirely. I was shocked that Comodo’s HackerGuardian simply gave me a PASSED score. More on that later in the article.


My Experience @ MIX08

Yes, I know that Microsoft is the enemy… or are they? I attended the MIX08 conference at the Venetian in Las Vegas, March 4-7, 2008. It changed my perspective on Microsoft. Now I know the difference between old Microsoft and new Microsoft. The new Microsoft has a new attitude towards open specifications. They’re working with Zend Technologies to make PHP work better in IIS7. They’ve opened the .NET framework specifications so the folks over at Mono can port .NET to Mac, Linux, Solaris and Unix. They’re opening the specifications for Silverlight so the same Mono group can deliver Moonlight. The Windows Live products, Live Search, Live Hotmail, Live Spaces, etc., all have open APIs that developers can use to create mashups on their own sites.


Deduplication Snapshots on Amazon S3

Deduplication refers to the practice of storing files by breaking them up into chunks (or slices), computing a unique hash for each chunk, then storing the chunks along with metadata that explains how to reassemble each file later. This is useful in a backup strategy, because you never have to back up the same chunk twice. It is especially useful when backing up multiple systems that contain the same, or similar, files. Imagine that I back up files on one system, including common operating system files and other common files. When I go to back up another machine, I don’t have to upload the chunks that already exist from the first system; I merely store the metadata about each file and how to reassemble it. Another good use is files that grow: when I back one up a second time, rather than storing the same information again, I only upload the new parts. A third use is taking snapshot backups of the same directory. If I take a full snapshot backup of a directory, then the second time I take a snapshot of that same directory, I only upload the deltas. In other words, say I take a snapshot of a particular directory every day; instead of storing a full copy of mostly redundant data, I only save the new file chunks. The snapshot is a point-in-time map explaining which files existed and which chunks to use to reassemble each file.

The concept: take each file -> break it into 5 MB chunks -> create a unique MD4 hash of each chunk -> compare each hash against the hashes already stored -> upload only the chunks that do not yet exist in the storage area -> save metadata so you can reassemble the files later.
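To make that pipeline concrete, here is a minimal sketch in Python, under a few assumptions of my own: a local directory stands in for the S3 storage area, the chunk store is keyed by hash, and MD5 is used in place of MD4 simply because it ships with Python’s hashlib. This is just an illustration of the chunk -> hash -> dedup -> metadata flow, not the actual tool discussed in the post.

```python
# Illustrative dedup sketch: split files into 5 MB chunks, store each unique
# chunk once (keyed by its hash), and record a per-file list of chunk hashes.
import hashlib
import json
import os

CHUNK_SIZE = 5 * 1024 * 1024  # 5 MB chunks, as described above

def store_file(path, chunk_store_dir, manifest):
    """Chunk a file, store only chunks not seen before, and record the
    ordered list of chunk hashes so the file can be reassembled later."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.md5(chunk).hexdigest()  # MD5 stands in for MD4 here
            chunk_path = os.path.join(chunk_store_dir, digest)
            if not os.path.exists(chunk_path):       # dedup: skip chunks already stored
                with open(chunk_path, "wb") as out:
                    out.write(chunk)
            hashes.append(digest)
    manifest[path] = hashes                           # metadata needed for reassembly

# A "snapshot" is then just a manifest built over every file in a directory.
manifest = {}
for root, _dirs, files in os.walk("/data/to/backup"):
    for name in files:
        store_file(os.path.join(root, name), "/backup/chunks", manifest)

with open("/backup/snapshot-manifest.json", "w") as f:
    json.dump(manifest, f)
```

Restoring a file from a snapshot is the reverse: read its entry in the manifest and concatenate the listed chunks in order.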
