Can GFL SDK get accurate pixel color?

Discussions on GFL SDK, the graphic library for reading and writing graphic files


Troberg
Posts: 1
Joined: Mon Oct 24, 2011 9:21 am

Can GFL SDK get accurate pixel color?

Post by Troberg »

I've got a problem that I've been fighting for quite some time, and I hope that GFL SDK just might be a solution. My problem:

I've written a program to find logical duplicates of images (i.e. the same image, but possibly resized, stamped or saved in a different format) in a very large collection of several million images. To do this, I sample the image at certain pixels and use those samples to calculate a "fingerprint" of the image, which is then stored in a database. This fingerprint finds the dupes accurately, with a signal-to-noise ratio well above 100: even similar images that are not dupes, such as adjacent stills from a slow movie scene, score at least 100 times lower than a real dupe.
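To make that concrete, here is a much simplified sketch of the idea (not my actual algorithm; the 8x8 grid and the luminance weights are just placeholders for illustration). It assumes a 24-bit RGB buffer in memory:

    #include <array>
    #include <cstddef>
    #include <cstdint>

    // Sample brightness at fixed relative positions, producing a small vector
    // that can be compared with a distance measure. Resized copies of the same
    // image should give nearly identical vectors, because the sample grid
    // scales with the image dimensions.
    std::array<uint8_t, 64> Fingerprint(const uint8_t* pixels,
                                        int width, int height, size_t stride)
    {
        std::array<uint8_t, 64> fp{};
        const int kGrid = 8;                              // 8x8 sample grid
        for (int gy = 0; gy < kGrid; ++gy) {
            for (int gx = 0; gx < kGrid; ++gx) {
                // Centre of grid cell (gx, gy), scaled to the image size.
                int x = (2 * gx + 1) * width  / (2 * kGrid);
                int y = (2 * gy + 1) * height / (2 * kGrid);
                const uint8_t* p = pixels + (size_t)y * stride + (size_t)x * 3;
                // Integer approximation of luminance from an RGB triple.
                fp[gy * kGrid + gx] = (uint8_t)((77 * p[0] + 151 * p[1] + 28 * p[2]) >> 8);
            }
        }
        return fp;
    }

Two fingerprints are then compared with something like the sum of absolute differences, and a pair scoring far below the noise floor gets flagged as a dupe.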

The problem is that when I run the program on another computer, I get a different fingerprint. So if my machine dies, I'll have to rescan the entire collection, a process that takes about six weeks of 100% CPU usage and heavy hard disk wear on a modern computer...

I've traced it down to Windows reporting not the real color of the pixel in the file, but the pixel as it is rendered on the screen (even when the image is only loaded into a memory device context). That means that with a different graphics card, or different settings on the same card, you get different colors. This happens regardless of which environment I develop in (C++, .NET, VB), so my conclusion so far is that it's a bug/feature of the Windows API.
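In outline, the code path that shows the problem looks like this (a simplified sketch; the real program does more, but the read-back is the same):

    #include <windows.h>

    // The bitmap is selected into a memory DC that is compatible with the
    // screen, so GetPixel() reads it back in the display's pixel format
    // (e.g. 16-bit), and that format differs from machine to machine.
    COLORREF GetPixelViaGdi(HBITMAP bmp, int x, int y)
    {
        HDC screen = GetDC(NULL);
        HDC mem = CreateCompatibleDC(screen);   // inherits the screen's format
        HGDIOBJ old = SelectObject(mem, bmp);
        COLORREF c = GetPixel(mem, x, y);       // device-dependent result
        SelectObject(mem, old);
        DeleteDC(mem);
        ReleaseDC(NULL, screen);
        return c;
    }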

So, of course, the question now is: can GFL SDK give me the intended color of a pixel as stored in the file, rather than the color as it's rendered on the screen?
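To make the question concrete, this is roughly what I'd hope to write (a sketch only; I've taken the function and field names from the SDK documentation as I read it, so please correct me if any of them are off):

    #include <libgfl.h>
    #include <cstdio>

    int main(void)
    {
        if (gflLibraryInit() != GFL_NO_ERROR)
            return 1;

        GFL_LOAD_PARAMS params;
        gflGetDefaultLoadParams(&params);
        params.ColorModel = GFL_RGB;                 // ask for a fixed pixel layout
        params.Flags |= GFL_LOAD_FORCE_COLOR_MODEL;  // so every machine decodes alike

        GFL_BITMAP* bitmap = NULL;
        if (gflLoadBitmap("test.jpg", &bitmap, &params, NULL) == GFL_NO_ERROR) {
            // Data is the decoded pixel buffer; BytesPerLine is the row stride.
            const GFL_UINT8* p = bitmap->Data;       // first pixel of the first row
            printf("first pixel: R=%u G=%u B=%u\n", p[0], p[1], p[2]);
            gflFreeBitmap(bitmap);
        }

        gflLibraryExit();
        return 0;
    }

If the bytes in Data come straight from the decoder, identical files would give identical fingerprints on every machine, which is exactly what I need.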