Color Measurement of Segmented Printed Fabric Patterns in Lab Color Space from RGB Digital Images

Color is the first quality parameter of textile materials evaluated by consumers and a key factor considered in selecting printed fabric. In the textile industry, digital printed fabric analysis is one of the basic elements in successfully applying a color management scheme and objectively evaluating fabric color alterations. Precise color measurement is mostly used in sample analysis and quality inspection, which help to produce reproducible or similar products. For quality inspection, the color of the product must be measured as a necessary requirement of quality control in deciding whether the product is to be accepted or not. This study presents an unsupervised segmentation of printed fabric patterns using the mean shift algorithm, followed by color measurement over the segmented regions of the printed fabric patterns. The results establish a consistent and reliable color measurement of multiple color patterns and appearances, with values within the established range and without any human interaction.


Introduction
Color is conceivably one of the most significant features of textile materials. It is one of the basic elements considered in textile production, the garment industry and decorative applications. It is essential to ensure that textile materials and clothing are of suitable color, according to the designer's idea and fashion trends [1]. Reproducing and measuring multiple fabric colors, especially intricate repeat patterns, is still a challenge to the textile printing and garment industries. It has been noted that environmental influences such as weather, artificial light, laundering, ironing and body perspiration are connected to a drastic decline in the color stability of textile products [1] [2]. These factors may depend on parameters (color measurements) that are ignored or given little attention during and after the production process. The determination and color measurement of printed fabric patterns is vital not only from the aesthetic point of view but also in detecting any change that may arise and indicating adjustments in appearance, which can lead to desirable quality control in printed fabrics. The authors of [3] [4] identified that colors are classically assessed visually in the apparel industry, which yields poor results due to the variability of daylight and of individual perception. Lau et al. introduced checking cabinets and light booths employing standard illuminants so that samples could be viewed under invariable conditions while assessing colors against a standard [3] [5]. The results still showed high dispersion, owing to the diverse opinions of operators/administrators rather than facts that should have been recorded automatically without human intervention.
The above attempts led to several studies and methodologies that effectively addressed successful color measurement of printed textiles/fabrics. Bugao Xu and Sheng Lin [6] developed a hybrid method of self-organizing map (SOM) and fuzzy c-means clustering to automatically identify multiple colors of printed fabric. Their method converts a color image into a planar density map that indicates the pixel counts of each major color cluster.
The method results in an objective separation of color regions in an image and enables color evaluation on an individual basis. Despite its success, it was designed principally to determine the number of major color clusters in the image and the average color value of each cluster.
In a related development, Xu [7] performed an evaluation of color alterations on fabrics by image analysis. This work proved that image analysis (IA) can precisely locate, segment, and analyze fabric areas that may be affected by crocking and scuffs or that have dye or color flaws. It shows that only color changes on fabric are assessed; moreover, unlike colorimetry, the IA system is able to consistently characterize discolored areas of various sizes, shapes, and uniformity in an objective and quantitative way.
Lou et al. [8] correspondingly designed a multispectral imaging approach to measure color and match single yarns without winding, in which a single yarn is segmented from the background in multispectral images by a modified k-means clustering method. In this work, the multispectral reflectance of the single yarn is obtained by an averaging method, and an imaging system, namely imaging color measurement (ICM), is used to evaluate the proposed method. Even though the approach was successful, it targeted only the color of individual single yarn strands before they are made into fabric. Lau et al. [5] stated that there is a robust correlation between the color coordinates obtained from a spectrophotometer and from DigiEye for fabrics.
Even though this work does not use DigiEye as a measuring instrument component, Lau's assertion cannot be accepted entirely, because DigiEye is suitable for measuring both large and small areas of printed fabric patterns with multiple colors and intricate designs, which exceed the measurement area of the spectrophotometer conventionally used for textiles (e.g. 30 mm in diameter) [1].
Neda et al. conducted a survey of two commercial spectrophotometers with different measuring geometries (GretagMacbeth Eye-One Pro) to scrutinize the measurement uncertainty in color classification of textile products [9]. The work comprises two categories: precision and accuracy. The results were expressed in terms of repeatability and reproducibility. Repeatability, in their case, relates to the discrepancy between readings of the same sample repeated consistently on the same instrument.
It is quantified by calculating the Mean Color Difference from the Mean (MCDM) of the average color differences. Reproducibility, on the other hand, reflects the variation between measurements of the same sample made with different measuring instruments. When the instruments share an identical design, the reproducibility result is considered inter-instrument agreement; when the instruments differ in measurement design, it is considered inter-model agreement. Although this method presented some good results, its design parameters are limited to the measurement uncertainty of geometrical shapes of textile fabrics [10].
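The MCDM repeatability statistic described above can be sketched as follows, assuming the repeated measurements are available as L*a*b* triples; the function name and the numbers below are illustrative, not from the study:

```python
import numpy as np

def mcdm(lab_readings):
    """Mean Color Difference from the Mean: average Euclidean (Delta E)
    distance of each repeated L*a*b* reading from the mean reading."""
    lab = np.asarray(lab_readings, dtype=float)
    mean_lab = lab.mean(axis=0)                       # centroid reading
    return float(np.linalg.norm(lab - mean_lab, axis=1).mean())

# Three repeat readings of one sample (hypothetical numbers)
readings = [[50.1, 10.0, -5.0], [49.9, 10.0, -5.0], [50.0, 10.0, -5.0]]
print(round(mcdm(readings), 4))  # 0.0667
```

A small MCDM indicates high repeatability; the same statistic computed across instruments would quantify reproducibility.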
In view of the above technical difficulties in color measurement of textile materials, especially printed fabrics, we employ a new method in which printed fabric color patterns are segmented with the mean shift algorithm and subsequently measured mathematically by converting the RGB images to the Lab color space. The RGB images of fabric patterns were captured by a computer-controlled DigiEye system, which produces repeatable, high-quality images. The Lab color space expresses color as three numerical values: L* for lightness, and a* and b* for the green-red and blue-yellow color components. CIELAB (from the Commission Internationale de l'Eclairage, CIE) is considered perceptually uniform with respect to human color vision, meaning that a similar amount of numerical change in these values corresponds to about the same amount of visually perceived change. CIELAB is a universally standardized color space that supports mathematical conversion.
It is also important to note that digital cameras were not designed as scientific measuring instruments but rather for making images look good. For this and other reasons, our study aims at promoting effective quality inspection control in the textile printing industry by helping to resolve the complications involved in measuring printed fabric color patterns. The color measurement of segmented printed fabric patterns can subsequently be used in sample analysis and quality inspection, where the sample serves as a process parameter to produce the same or a similar product. This will help quality inspection to detect inconsistencies with respect to standards and to establish the cause of any irregularity [11]. The rest of the paper is structured as follows: objectives of the research, materials and methods adopted for the study, detailed results and discussion, and conclusions.

Materials and Methods
The printed fabrics are plain cotton woven patterns. They were washed, ironed and captured using the DigiEye system (Great Britain) shown in Figure 1. Given the original pattern values {xi}, the mean shift procedure produces a set of convergence points {zi} and a set of labels [12] [13]: for each pixel xi, run the mean shift process and store the convergence point in zi. The RGB color space, with Red, Green and Blue as its primary components, is based on the RGB color model widely used in textiles. A specific RGB color space is characterized by the three coordinates corresponding to its additive primaries, from which any desired color can be produced [14] [15]. Figure 2 shows the diagonal cube of the primary color components.

A. Lab color space
Technically, the Lab color space is mapped onto a three-dimensional digital space consisting of the L*, a* and b* values, each with a pre-defined range. In Figure 3, the L* axis represents lightness, and the neutral (gray) colors lie where a* = 0 and b* = 0. The a* axis represents the green-red component, with green in the negative direction and red in the positive direction. The b* axis represents the blue-yellow component, with blue in the negative direction and yellow in the positive direction [16] [17]. The scaling and limits of the a* and b* axes depend on the specific application [18]. Nevertheless, they are often computed in the range of ±100, or −128 to +127 (signed 8-bit integer).
The transformation can be written as y = f(x; θ), where θ is the parameter vector of the model f. When f is linear, a direct linear regression method is used to estimate the parameters. For non-linear functions, iterative approaches are needed; here the fminsearch function is used to search for the minimum of the target function using a derivative-free simplex method.
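The paper performs this fit in MATLAB (fminsearch for the non-linear case). For the linear case, the direct regression step can be sketched in Python with ordinary least squares; the RGB data and the "true" matrix below are synthetic illustrations, not the paper's values:

```python
import numpy as np

# Hypothetical paired data: RGB coordinates of sample patches and the
# target values measured for the same patches. true_M is synthetic,
# used only to generate a consistent example.
rgb = np.array([[0.2, 0.1, 0.7],
                [0.9, 0.5, 0.1],
                [0.4, 0.8, 0.3],
                [0.6, 0.6, 0.6]])
true_M = np.array([[0.5, 0.3, 0.2],
                   [0.1, 0.7, 0.2],
                   [0.2, 0.2, 0.6]])
target = rgb @ true_M.T  # pretend these were the measured values

# Direct linear regression: solve target ~= rgb @ M.T for M in the
# least-squares sense (the linear-model branch described above).
M_fit, *_ = np.linalg.lstsq(rgb, target, rcond=None)
M_fit = M_fit.T
print(np.allclose(M_fit, true_M))  # True: the matrix is recovered
```

For a genuinely non-linear f, scipy.optimize.minimize with method="Nelder-Mead" plays the role of MATLAB's fminsearch.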
The methodology used for converting RGB to L*a*b* consists of two steps.
• The first step transforms RGB to XYZ:

    [X, Y, Z]^T = M [R, G, B]^T

where M is the transformation matrix.
• The second step transforms XYZ to L*a*b*:

    L* = 116 f(Y/Yn) − 16
    a* = 500 [f(X/Xn) − f(Y/Yn)]
    b* = 200 [f(Y/Yn) − f(Z/Zn)]

where f(t) = t^(1/3) for t > (6/29)^3 and f(t) = t/(3(6/29)^2) + 4/29 otherwise, and Xn, Yn and Zn are the tristimulus values of the reference white. To determine the least error for this experiment, the average Root Mean Square Error (RMSE_L, RMSE_a, RMSE_b) was calculated between segmented and un-segmented measured values using the equation

    RMSE = sqrt((1/n) Σ_i (v_i − v̂_i)^2)

where v_i and v̂_i denote the segmented and un-segmented values of the corresponding channel and n is the number of measurements.
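The two conversion steps and the RMSE comparison can be sketched as follows. This is a minimal sketch assuming linear RGB input in [0, 1], sRGB primaries and the D65 reference white; the paper's own matrix M, estimated from the image acquisition system, may differ:

```python
import numpy as np

# Linear-RGB -> XYZ matrix (sRGB primaries, D65 white); an assumption
# standing in for the paper's fitted conversion matrix M.
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])
WHITE = np.array([0.95047, 1.0, 1.08883])  # D65 reference white (Xn, Yn, Zn)

def rgb_to_lab(rgb):
    """Step 1: RGB -> XYZ via M. Step 2: XYZ -> CIE L*a*b*."""
    xyz = M @ np.asarray(rgb, dtype=float)
    t = xyz / WHITE
    eps = (6 / 29) ** 3  # threshold between cube-root and linear segments
    f = np.where(t > eps, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[1] - 16
    a = 500 * (f[0] - f[1])
    b = 200 * (f[1] - f[2])
    return np.array([L, a, b])

def rmse(v, v_hat):
    """Root mean square error between two value sequences."""
    v, v_hat = np.asarray(v, float), np.asarray(v_hat, float)
    return float(np.sqrt(np.mean((v - v_hat) ** 2)))
```

As a sanity check, black maps to L* = 0 and the reference white maps to approximately (100, 0, 0).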

Image Segmentation
According to Gonzalez, image segmentation is the partitioning of a digital image into multiple segments in which pixels of a region share similar characteristics such as color, intensity or texture [19] [20]. In this study, the segmentation objective was application-driven, and the segmentation process was considered a pre-processing step. The process therefore did not depend on prior knowledge of the printed fabric patterns or on image-specific parameter adjustment.
The aim was to achieve well-defined color region patterns for meaningful further application. Zaitoun noted that image subdivision is an important step from image processing to image analysis; segmentation delineates the target and has a decisive effect on the feature measurements on which high-level image analysis depends [21]. The figures below show the well-defined component patterns, the number of clusters and their segmentation processes. Figure 4 and Figure 5, which are composed of two clusters, were segmented with an adjusted parameter of 0.4.
Figure 6 has three clusters and was segmented with a parameter of 0.5. Figure 7 and Figure 8, which have four clusters, were segmented with a parameter of 0.19. Figure 9 has five clusters and was segmented with a parameter of 0.20. The parameter in each case was adjusted to improve the segmentation result.
This basic procedure is aimed at reducing error in measurement evaluation.
The mean shift segmentation employed in this study identifies arbitrarily shaped regions by locating the modes of the density distribution and grouping all pixels associated with the same mode. The segmentation was carried out with parameter tuning, where the bandwidth was adjusted to suit the number of clusters in the printed fabric patterns. The following bandwidths appropriately segmented the patterns selected for this work: 0.4 for two-color fabric, 0.5 for three colors, 0.19 for four colors and 0.20 for five colors.
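The mode-seeking and grouping steps described above can be sketched as follows. This is a minimal flat-kernel illustration operating in color space only (a full implementation, as in the mean shift literature the paper cites, works in the joint spatial-range domain); the function and variable names are illustrative:

```python
import numpy as np

def mean_shift_modes(points, bandwidth, n_iter=20):
    """Flat-kernel mean shift: move every point toward the mean of its
    bandwidth neighbourhood until it settles on a density mode, then
    merge nearby modes and label each point by its mode."""
    pts = np.asarray(points, dtype=float)
    shifted = pts.copy()
    for _ in range(n_iter):
        for i in range(len(shifted)):
            mask = np.linalg.norm(pts - shifted[i], axis=1) <= bandwidth
            shifted[i] = pts[mask].mean(axis=0)   # shift toward local mean
    modes, labels = [], np.zeros(len(pts), dtype=int)
    for i, p in enumerate(shifted):
        for j, m in enumerate(modes):
            if np.linalg.norm(p - m) < bandwidth / 2:  # same mode
                labels[i] = j
                break
        else:
            modes.append(p)
            labels[i] = len(modes) - 1
    return np.array(modes), labels

# Pixels of a hypothetical two-color pattern (dark and light RGB values)
colors = np.array([[0.10, 0.10, 0.10], [0.12, 0.10, 0.11],
                   [0.90, 0.90, 0.90], [0.88, 0.92, 0.90]])
modes, labels = mean_shift_modes(colors, bandwidth=0.4)
print(len(modes))  # 2 clusters, one per color
```

The bandwidth plays the same role as the tuned parameters above: a larger value merges more pixels into fewer clusters.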

Color Measurement
Color measurement of printed fabric is necessary as a step of quality inspection. The values of R² (R-square) and RMSE (root mean square error) computed from the estimated L*a*b* values determine the efficiency of our method in this work. We compared the R² and RMSE values of the segmented measurements with the corresponding R² and RMSE values of the un-segmented measurements.
In Figure 10, the measurements of the two-color printed fabric were normalized as f(x, y) of the mean and standard deviation; R² for the segmented measured pattern is recorded at 0.6295 against 0.6541, and the RMSE is calculated at 3.245 against 4.2955. The L*a*b* values (segmented and un-segmented) represent three-dimensional real-number coordinates that describe the colors in the printed pattern independently of how each color is produced, which makes the Lab color space closer to human vision. The L*, a* and b* units define a complete color space that does not rely on interaction with any input or output device, and they define the colors of the printed pattern more accurately when segmented.
Segmented and un-segmented patterns in Figure 11 and Figure 12 show minimal, non-significant variation in lightness in Figure 11. Although the means and standard deviations (STDs) are not significantly different, the L* plot demonstrates distributional variation between areas, with a higher peak density in both cases, and exhibits much less disparity over the a* and b* color axes compared to the L* axis. The mean shift algorithm is able to segment the two figures (Figure 11 and Figure 12) with the same adjusted bandwidth. The a* and b* axes established a good correlation, indicating a trend in surface colors ranging from less red across the axes toward blue and yellow. The L* axis has less yellow toward lightness, which is not very prominent. The plot of L* against b* shows more blue surface colors with less white, more white and yellow, and also more white and less yellow, unlike the un-segmented values, where a* and b* illustrate less red and dominant blue across the axes with less yellow along the L* axis. The R² values are 0.827 for the segmented and 0.9788 for the un-segmented measured values, while the RMSE is recorded at 1.688 and 2.4891 respectively.

C. Kumah et al.
Colors in Figure 14 are more evenly distributed owing to the uniformity of the colors and the homogeneity of the pattern. Segmented and un-segmented patterns plot fairly closely, with less red along the a* and b* axes that diminishes toward blue, more lightness on the L* axis, and dominant yellow. Comparing this result with the other samples in the study, the homogeneity of the printed fabric color was identified as a significant parameter affecting the measurement of printed textiles. Variations recurring in the measurement process are also due to dark and light tones. R² for the segmented measured pattern is 0.8481, while the un-segmented measured pattern is recorded at 0.9794; their respective RMSE values are 1.8345 and 2.4080, showing that the segmented printed pattern presents less error than the un-segmented one. Figure 15, with five colors, presents a considerable improvement in the R² result, at 0.4523 for the segmented measured pattern and 0.4732 for the un-segmented measured pattern, whereas the RMSE is recorded at 3.0379 and 3.4639 respectively.
The increased error (RMSE) reflects the intricate patterns in printed fabric; patterns that are homogeneous and fairly uniform in color and shape can be categorized by their colors, and this categorization can likewise be used to separate defects from non-defects in a printed fabric pattern.
Table 1 and Table 2 summarize the normalized f(x, y) of the mean, standard deviation, R² and root mean square error for the segmented and un-segmented measured patterns used in the color evaluation.
The method used in this work was based on a linear model that converts the RGB color space, through XYZ, to L*a*b*. The results of this study show that employing digital image processing techniques, namely image segmentation, prior to the color measurement transformation helps in manipulating the digital images by revealing objects that are not otherwise visible, improving image information retrieval, and distinguishing the region of interest (ROI) in the image for better color evaluation. The segmented patterns, unlike un-segmented patterns, provide a dimensionality more adaptable to the mathematical transformations that attempt to correct systematic errors, which also made the processing faster in this study.
In the Lab coordinates, L* = 0 indicates black and L* = 100 indicates full lightness. The a* (red-green) and b* (yellow-blue) opponent channels are computed as differences of the nonlinear lightness transformations of X, Y and Z, which originate from the RGB color space; the resulting coordinates are independent of the device from which they were converted [23].
CIELAB is a chromatic color space whose values imitate the nonlinear response of the human eye [24]. Consequently, it was noted that uniform variations of the L*a*b* components correspond approximately to uniform variations in the perceived colors across all the segmented measured values. This study thus reveals that the segmented and un-segmented values of L*, a* and b* were within range, even though the un-segmented measured patterns registered a higher RMSE than the segmented measured patterns. This proves the effectiveness of the mathematical color transformation employed in measuring printed fabric patterns in this study.
The perceptual difference between any two colors in the L*a*b* color space can be estimated by considering each color as a point in a three-dimensional space and computing the Euclidean distance between the points.
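Under that interpretation, the classic CIE76 color difference is simply the Euclidean distance between two Lab points; a minimal sketch:

```python
import numpy as np

def delta_e(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two L*a*b* points."""
    diff = np.asarray(lab1, dtype=float) - np.asarray(lab2, dtype=float)
    return float(np.linalg.norm(diff))

print(delta_e([50, 0, 0], [50, 3, 4]))  # 5.0
```

Later CIE formulas (CIE94, CIEDE2000) refine this distance with weighting terms, but the Euclidean form above is the one implied by a perceptually uniform space.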

Conclusions
In this study, two general-purpose computer vision techniques, image segmentation based on the mean shift algorithm and L*a*b* color space transformation from RGB, were applied to selected digitally printed fabric patterns. Vital information was extracted from the original printed patterns using the mean shift algorithm and then validated mathematically by converting RGB to XYZ and then to L*a*b*. The comprehensive results of the segmented measured values, compared to the un-segmented measured values, show that the adopted method can effectively validate homogeneity and uniformity in the colors of printed fabric. For quality inspection control, it is necessary to assess production parameters, including the color of the printed fabric, with the aim of producing the same or a similar product. The CIELAB color space should be evaluated to determine the perceptual uniformity between sample and original, since it is designed so that equal coordinate differences correspond to equal color differences perceived by humans. This could be a desirable approach for market success, because audience perceptions and reactions are paramount to the growth of the textile industry.
Future research will focus on other mathematical transformations, compared against digital devices, to determine an appropriate and efficient method of measuring segmented printed fabric color patterns, thus minimizing the root mean square error with little or no human interaction.
The convergence points are then grouped by connecting all points that are closer than 0.5 to each other in the joint domain, and spatial regions smaller than M pixels are eliminated.
The mathematical transformation of the color space was estimated from the model parameters as follows. Let f be the function that transforms the RGB coordinates into L*a*b*, where Xn, Yn and Zn are the reference white values and Mij are the elements of the conversion matrix M between the RGB and XYZ spaces. To implement this conversion reliably, the function f in Equation (1) is defined from Equation (3) and Equation (4); it receives as parameters the elements of the conversion matrix M, as well as the RGB and L*a*b* data from the samples obtained with the image acquisition system. The normalized mean error in the estimate of each of the L*a*b* variables was obtained by comparing the segmented measured values (L*, a*, b*) with the un-segmented estimated values (L̂*, â*, b̂*), where L̂*, â* and b̂* are the un-segmented measured values and n is the number of measurements. L* ranges from 0 to 100, and a* and b* lie between −128 and +127. The performance of the method was evaluated by calculating the mean error over the n measurements.
Pixels in the segmented regions share some common attributes. The normalized mean values are: segmented L*a*b* for Figure 11, f(y) = −14.38; un-segmented L*a*b* for Figure 11, f(y) = −14.13; segmented L*a*b* for Figure 12, f(y) = −18.76; and un-segmented L*a*b* for Figure 12, f(y) = −18.91. These reflect that pixels in those regions are similar with respect to some characteristic or computed property such as color, intensity or texture, even though the patterns have different numbers of clusters/colors (two colors for Figure 11 and three colors for Figure 12). R² for the segmented pattern is 0.9903 against 0.9911 for the un-segmented pattern, with RMSE values of 1.2559 and 1.3200 respectively. Segmented and un-segmented patterns in both figures display close results due to the characteristics present in the patterns.

Figure 10. The relationship between segmented and un-segmented measured values of pattern (A).

Figure 11. The relationship between segmented and un-segmented measured values of pattern (B).

Figure 13 validates possible relationships between all the color axes.

Figure 13. The relationship between segmented and un-segmented measured values of pattern (D).
Figure 14. The relationship between segmented and un-segmented measured values of pattern (E).

Figure 15. The relationship between segmented and un-segmented measured values of pattern (F).
Negative (−a*) and positive (+a*) values indicate green and magenta, in the range of −128 to +127 respectively. The position +127 indicates positive b* (+b*) values (yellow), while −128 indicates negative b* (−b*) values (blue). The feasible range of channels a* and b* is therefore −128 to +127.

Table 1 .
Normalized f(x, y) of the mean, standard deviation, R² and root mean square error of the segmented measured printed color patterns.

Table 2 .
Normalized f(x, y) of the mean, standard deviation, R² and root mean square error of the un-segmented measured printed color patterns.