Friday, October 31, 2014

Project update 3

My synthesis project on measuring the gamut of displays and plotting it in CIE xy chromaticity space has come to a halt. What to do? Let me enumerate my problems.

My power distribution versus wavelength data, also known as the spectroscopy data of my laptop display and Kindle display, has hit a bump in the road. I need it to solve for the vertices of my gamut polygon.

Ma'am Jing suggested I use interpolation to reduce my data, and I did. It worked. But when I checked my data with the isnan() function in Scilab, there were slots where it returned true.

So how can I get rid of these NaN (not-a-number) values in my data? Should I reduce my spectroscopy data in Excel instead? Or should I use one of the other interpolation methods, like 'linear' or 'spline', since I used 'nearest' in Scilab's interp1() function?

Why should I use these methods? Will they yield better results? I tried 'linear', and it gave an error: "Grid abscissae of dim 2 not in strict increasing order."
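Here is a minimal sketch of what I think should fix both problems. The variable names (wl_meas and P_meas for one measured spectrum, wl_cmf for the color matching function wavelength grid) are placeholders for my own data, not actual lines from my script:

// interp1 needs strictly increasing abscissae, so sort the wavelengths and
// drop duplicates (duplicates are what trigger the "not in strict increasing
// order" error).
[wl_s, k] = gsort(wl_meas, 'g', 'i');   // sort wavelengths in increasing order
P_s = P_meas(k);
[wl_u, ku] = unique(wl_s);              // remove repeated wavelength entries
P_u = P_s(ku);

// Resample onto the CMF wavelength grid; query points outside the measured
// range can come back as %nan.
P_i = interp1(wl_u, P_u, wl_cmf, 'linear');

// Replace the remaining NaN slots, here by assuming zero power outside the
// measured band.
P_i(isnan(P_i)) = 0;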

I think the 'nearest' interpolation method yielded good results, but it could not plot the spectroscopy data of the blue screen of my laptop display.

Here is the plot. The colors of the curves correspond to the red-screen and green-screen spectroscopy data.


I'm getting frustrated since I have to do the Kindle spectroscopy data, but once I solve this problem I can start my paper on this project.

Here are the color matching functions (left) and the spectroscopy data of my laptop display (right):
They may look like they have the same number of data points, but they do not: the spectroscopy data is a 1240x1 matrix for the red and blue screens and a 1238x1 matrix for the green screen, while the color matching functions are only a 471x1 matrix.
You can see that the spectroscopy data and the color matching functions look similar. Why? Because the color matching functions are like the sensitivity of the eye to color, and they are based on standard observer data.

Tuesday, October 28, 2014

Project update 2

I superimposed my plotted CIE xy diagram on the one from Wikipedia.org.

CIE xy chromaticity diagram from http://upload.wikimedia.org/wikipedia/commons/thumb/3/3b/CIE1931xy_blank.svg/450px-CIE1931xy_blank.svg.png

Plotted CIE xy chromaticity diagram superimposed with image from above.



I ran into problems making the gamut polygon.

So I asked Ma'am Jing what to do so that my spectroscopy data would have the same matrix size as the color matching functions of the CIE xy chromaticity diagram.
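Once the sizes match, here is a rough sketch of how I plan to get the gamut vertices, using the tristimulus equations from project update 1. The variable names (P_r, P_g, P_b for the resampled red/green/blue screen spectra; xbar, ybar, zbar for the color matching functions) are placeholders:

// Chromaticity of one screen spectrum: X = sum(P.*xbar), etc., then
// x = X/(X+Y+Z) and y = Y/(X+Y+Z).
function [x, y] = chroma(P, xbar, ybar, zbar)
    X = sum(P .* xbar); Y = sum(P .* ybar); Z = sum(P .* zbar);
    x = X / (X + Y + Z); y = Y / (X + Y + Z);
endfunction

[xr, yr] = chroma(P_r, xbar, ybar, zbar);
[xg, yg] = chroma(P_g, xbar, ybar, zbar);
[xb, yb] = chroma(P_b, xbar, ybar, zbar);
plot([xr xg xb xr], [yr yg yb yr]);   // closed gamut triangle over the xy diagram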

Sunday, October 19, 2014

Project update 1

My project in AP 186 is to determine how good a color display is when quantum dots are used as the light-emitting devices instead of light-emitting diodes or other devices.

How can I know how good a color display is? This is where the CIE xy chromaticity coordinates come in.

So first, how does the eye perceive color? We know that the color primaries (red, green, blue) can be mixed to form a color, and such a mixture makes up a color pixel in a display like a TV. The CIE color matching functions are like the spectral sensitivity of the eye to the color primaries, because they are derived from measurements on human observers.

To do this I need to plot a CIE xy chromaticity diagram.
So I used these equations:

X = K Σ P(λ) x̄(λ)     (1)
Y = K Σ P(λ) ȳ(λ)     (2)
Z = K Σ P(λ) z̄(λ)     (3)

where K is a normalizing factor, P is the power distribution of some object Q, the sums run over the wavelength λ, and x̄, ȳ, z̄ are the color matching functions that I downloaded from the web. The power distribution I used is a Dirac delta function at every monochromatic wavelength from 380 nm to 780 nm in increments of 5 nm, so that I can trace out the bounds of the CIE xy chromaticity diagram. [1]

Now you can get the CIE xy chromaticity coordinates using these equations [1]:

x = X / (X + Y + Z)     (4)

and

y = Y / (X + Y + Z)     (5)
I have already plotted the CIE color matching functions using Scilab; the plot is shown below.


And I plotted the CIE xy chromaticity diagram using equations 1-5; the result is shown in Figure 1 after the short sketch below.
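In this sketch, the file name and column layout of the color matching function data (wavelength, xbar, ybar, zbar) are assumptions about the file I downloaded from [5]:

// Load the CIE 1931 color matching functions (assumed CSV layout).
cmf  = csvRead('ciexyz31.csv');
xbar = cmf(:,2); ybar = cmf(:,3); zbar = cmf(:,4);

// A Dirac delta power distribution at each wavelength makes X, Y, Z reduce to
// xbar, ybar, zbar at that wavelength, so equations 4-5 give the locus directly.
s = xbar + ybar + zbar;
x = xbar ./ s;
y = ybar ./ s;
plot(x, y);   // boundary ("tongue") of the CIE xy chromaticity diagram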



Figure 1. Computed CIE xy tongue.

Figure 2. Reference CIE xy tongue from Wikipedia.org
Now I need to measure the gamut of the color displays, so I can see whether a quantum-dot display is better than an LED display.


References:
[1] J. Soriano, AP 186 manual - CIE xy Chromaticity Diagrams 2010, 2014.

[2] CIE 1931 color space, http://en.wikipedia.org/wiki/CIE_1931_color_space

For CIE xy tongue comparison map:
[3] http://upload.wikimedia.org/wikipedia/commons/thumb/3/3b/CIE1931xy_blank.svg/450px-CIE1931xy_blank.svg.png

For colloidal quantum dots:
[4] http://www.nano-reviews.net/index.php/nano/article/view/5202/5767#F0003

For the CIE's color matching functions:
[5] http://www.cvrl.org/cmfs.htm

Wednesday, October 1, 2014

AP 186 Activity 8 - Morphological operations part 2

Now we will apply what we did in the Morphological operations part 1 post.

So we take an image of punched circles scanned on a flatbed scanner.


Now I have to divide this image into 256x256-pixel subimages, with file names numbered in increasing order; I named them C_01.jpg and so on. The first subimage is shown after the sketch below:
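Here is a rough sketch of the cropping step. The scan's file name is a placeholder, and ReadImage()/WriteImage() are the IPD read/write functions as I understand them (signatures assumed):

big = ReadImage('circles_scan.jpg');    // placeholder name for the scanned image
n = 0;
for r = 1:256:size(big, 1) - 255
    for c = 1:256:size(big, 2) - 255
        n = n + 1;
        sub = big(r:r+255, c:c+255);
        WriteImage(sub, msprintf('C_%02d.jpg', n));   // C_01.jpg, C_02.jpg, ...
    end
end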


Now we make a histogram of this image so that we can choose a threshold; knowing this threshold will help us segment the image. The histogram is shown below.
The threshold I got is 211.83, and I passed this value to the SegmentByThreshold() function in Scilab. The segmented result is shown after the sketch below:
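A minimal sketch of this step, with the IPD function signatures assumed from this write-up:

I  = ReadImage('C_01.jpg');             // first subimage
histplot(256, double(I(:)));            // histogram used to pick the threshold
BW = SegmentByThreshold(I, 211.83);     // binarize at the chosen threshold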

Now we have to clean the image. We have three choices of morphological operators available in the IPD toolbox: CloseImage, OpenImage, and TopHat. I chose OpenImage, which is described as follows: "This function applies a morphological opening filter to an image. This filter retains dark objects and removes light objects the structuring element does not fit in." That is suitable for this image. We also need a structuring element, and I used the CreateStructureElement() function with a circle of size 11. The cleaned image is shown after the sketch below:
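A sketch of the cleaning step, again with the IPD signatures assumed:

se      = CreateStructureElement('circle', 11);   // circular structuring element, size 11
cleaned = OpenImage(BW, se);                      // opening removes what the SE does not fit in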

Next we have to remove the circles that overlap. To do that we label each contiguous blob using SearchBlobs(), which assigns each connected blob a number. Then I have to filter them by size. We don't know what the sizes are, so I labeled all the blobs in every subimage and plotted a histogram of the blob sizes from all the subimages.

From the histogram, I ignored the zero bin and chose the interval where the distribution peaks: 400 to 550 pixels. I passed this interval to the FilterBySize() function to filter out the circles/cells that are overlapped, since their blobs are too big to fall inside the interval. Then I used a colormap to give each remaining blob a unique color. A sketch of these steps follows:
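In this sketch the IPD signatures are assumed, and the blob-size loop is my own bookkeeping rather than toolbox code:

blobs = SearchBlobs(cleaned);                 // label each connected blob 1..N
N = double(max(blobs));                       // number of labeled blobs
sizes = zeros(1, N);
for b = 1:N
    sizes(b) = length(find(blobs == b));      // area of blob b in pixels
end
histplot(50, sizes);                          // blob-size histogram
normal = FilterBySize(blobs, 400, 550);       // keep blobs in the 400-550 pixel range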

I used these steps for all the subimages. I calculated the mean blob area for each subimage, then averaged these per-subimage means and took their standard deviation. A sketch of this is below.
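In sketch form, with subimage_means as a placeholder vector holding the mean blob area of every subimage:

best_area = mean(subimage_means);     // averaged mean blob area
uncert    = stdev(subimage_means);    // standard deviation of the per-subimage means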

Averaged mean: 532.10606
Standard deviation: 107.15587 

I take the standard deviation as the uncertainty, so the limit I will use for the size of a normal cell is 532.10606 ± 107.15587 pixels in area.

So now we proceed to isolate the abnormally sized cells using this best estimate of the normal cell size. We have an image of normal cells mixed with cancer cells:
We have to use the steps I enumerated earlier:

1) Segment the image to emphasize cells.
2) Use the morphological operation OpenImage() to clean the image, with a circular structuring element of size 13, since the earlier SE still gave overlapped blobs even after filtering.
3) Uniquely mark the blobs. 
4) Use FilterBySize function to filter out overlapped cells. 

We will use the pixel area obtained from the subimages. The value 532.10606 + 107.15587 serves as the lower limit of FilterBySize(): only blobs/cells with area larger than this are kept, and anything smaller is discarded. A sketch of this step follows:
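In this sketch the IPD signatures are assumed, and BW_cancer is a placeholder for the cleaned binary cancer-cell image:

upper_normal = 532.10606 + 107.15587;                  // normal-cell size plus its uncertainty
blobs_c = SearchBlobs(BW_cancer);                      // label the blobs in the cancer image
cancer  = FilterBySize(blobs_c, upper_normal, %inf);   // keep only the oversized blobs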

Here is the filtered version of the cancer image, with the abnormal circles marked.

Then the inverted filtered image was convolved with the original cancer image to produce an image that looks like the original but with the abnormal cells marked. The result is shown after the sketch below:
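An element-wise multiplication of the inverted mask with the original grayscale image is one simple way to get such a marked copy, so that is what this sketch assumes; it may differ from the exact operation I used, and I_cancer is a placeholder for the original image:

mask   = 1 - bool2s(cancer > 0);       // inverted binary mask of the abnormal blobs
marked = double(I_cancer) .* mask;     // original image with the abnormal cells blacked out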

The abnormal/cancer cells are marked and you can see that they are really bigger than the others.

I give myself a score of 10/10 since I completed this activity and did all that is required.

I had a difficult time with the cancer cells because the output was not what I wanted. I had to change the structuring element so that it would not produce overlapped circles.

References:
[1] M. Soriano, AP 186 manual, A8 - Morphological operations, 2014.