
Proceedings of International Conference "ICSEM'13"

Image Processing Techniques for Coin Classification Using LabVIEW

BHAGIRATHI V, Department of Mechatronics, Acharya Institute of Technology, Bhagi.suryavanshe@gmail.com
MEGHANA M SINTHRE, Department of Mechatronics, Acharya Institute of Technology, meghs.sinthre@gmail.com
DILIP R, Department of Mechatronics, Acharya Institute of Technology, dilipr.me@gmail.com

Abstract: The aim of the project was to develop a simulation for detecting and classifying currency coins by machine vision using the NI 1742 Smart Camera. Powered by a 533 MHz PowerPC processor, the camera greatly enhances processing speed, since its dedicated processor simplifies machine vision by analyzing images directly on the device. For programming it, LabVIEW RT (Real-Time) software was used in conjunction with Vision Assistant 2010.

Keywords: LabVIEW RT module, Vision Assistant, NI camera

1. INTRODUCTION

Coin numeration can be done by measuring weight or by using a vision system. We used real-time vision and image processing techniques for the identification of different coins. An extension of such a system would allow separation and counting of large numbers of coins of different denominations in places such as banks, temples, etc. The application can also easily be extended to sort coins of different countries.

As long as the background conditions can be controlled sufficiently, the coin detection task becomes almost trivial. If the conveyor belt is homogeneous and always darker (or brighter) than the coins, a simple threshold operation suffices for the separation. Once the current coin is separated from the background, the desired features can be extracted.

Prime objectives of the project were:
- To get acquainted with the LabVIEW 2010 software and its applications.
- To develop an algorithm for the specified problem and build LabVIEW code for the same.
- To test and validate the simulation.
- To construct a hardware setup mimicking the real-world problem using the NI 1742 Smart Camera.

2. METHODOLOGY

Software used: LabVIEW 2010 with the RT module, Vision Assistant 2010.
Hardware used: NI 1742 Smart Camera.

3. WHAT IS LABVIEW?

LabVIEW (short for Laboratory Virtual Instrumentation Engineering Workbench) is a platform and graphical programming environment developed by National Instruments (NI). It allows one to program with graphical functional blocks (that can be dragged and dropped) instead of writing lines of text. An important aspect is its dataflow representation, which allows for easy development and understanding of the code.

LabVIEW is used worldwide to develop sophisticated measurement, test, and control systems using intuitive graphical icons and wires that resemble a flowchart. It offers tight integration with thousands of hardware devices and provides hundreds of built-in libraries for advanced analysis and data visualization, all for creating virtual instruments (VIs). It automates the kinds of measurement and processing instruments used in any typical laboratory setup. The LabVIEW platform is scalable across multiple targets and operating systems (OS), and, since its introduction in 1986, it has become an industry leader. [1]

The latest version of LabVIEW is LabVIEW 2010, released in August 2010. LabVIEW Core I and II courses were taken at NI, Bangalore, during 23-27 May 2011.

3.1 Hardware Integration with LabVIEW

A significant benefit of LabVIEW over other development environments is the easy access to instrumentation hardware through built-in libraries and thousands of instrument drivers. Drivers and abstraction layers for many different types of instruments and buses are included or are available for inclusion. The provided driver interfaces save program development time, so programs can be written and test solutions deployed in a reduced time frame compared with more conventional systems.

3.2 Real-Time Module in LabVIEW

The National Instruments LabVIEW Real-Time Module is an add-on component for the LabVIEW Development System.

Fig 1: LabVIEW Real-Time Module

Programming graphically in LabVIEW can greatly improve one's programming efficiency, and this same graphical approach can be used with the LabVIEW Real-Time Module to create stand-alone systems that run for extended periods of time. While LabVIEW is commonly used to develop applications that run on desktop OSs such as Windows and Linux, these OSs are not optimized for running critical applications for an extended period of time. The LabVIEW Real-Time Module features real-time OS (RTOS) software that runs on NI embedded hardware devices. Using LabVIEW, we can therefore develop and debug code that is then downloaded directly to, and executed on, embedded hardware devices such as NI CompactRIO, NI Single-Board RIO, PXI, vision systems, or even third-party PCs, as made possible by LabVIEW RT. We have used it for programming the NI 1742 Smart Camera, which is embedded with a 533 MHz PowerPC processor.

3.3 Vision Assistant 2010

Vision Assistant is an independent module, but once installed it creates additional options (Vision and Motion) in the Functions toolbar of the LabVIEW block diagram. It provides an express VI, Vision Assistant, which can be used to create, edit and/or run vision algorithms using Vision Assistant 2010. When this VI is placed in the block diagram, Vision Assistant is launched, where we can build algorithms using the abundant functions offered for processing images. A few of the functions employed in the present project were:
- Threshold
- Morphology (Fill Holes)
- Particle Filtering
- Particle Analysis (report)
- Pattern Matching

Controls and indicators that are to be used programmatically through LabVIEW can also be selected in the case of the Vision Assistant Express VI. Alternatively, all the functions of Vision Assistant can be selected individually for building the algorithm directly in LabVIEW through the drag-and-drop functions available in the Vision and Motion functions toolbar. It is to be noted that these functions are available in LabVIEW only after Vision Assistant is installed.

4. NI 1742 SMART CAMERA

NI 17xx Smart Cameras simplify machine vision by analyzing images directly on the camera with a powerful embedded processor capable of running the entire suite of NI vision algorithms. [2] A multifunctional vision system that can transmit not just raw acquired images but also the examined results, it is an amalgamation of the onboard processor with a charge-coupled device (CCD) image sensor.

Housed in a rugged metal case, all NI Smart Cameras offer built-in I/O, multiple industrial protocols, built-in web servers, and many other features. The NI Smart Camera was programmed with LabVIEW Real-Time. Its main features are:
- Real-time machine vision
- High-quality monochrome VGA (640 x 480) CCD image sensor with an image acquisition rate of up to 60 fps
- High-performance embedded processor (533 MHz PowerPC)
- Isolated 24 V digital I/O
- Dual gigabit Ethernet ports, used for cross connection with the operating computer
- RS232 serial support

Fig 2: NI 17xx Smart Cameras

4.1 Hardware Details

Fig 3: Hardware details

The Vision Development Module (Vision Assistant 2010) recognizes the Smart Camera 1742 as soon as it is connected to the computer via a cross cable. For image acquisition programmatically through LabVIEW, however, an IMAQ session has to be created using IMAQ Init.vi. Subsequently, IMAQ Snap.vi or IMAQ Grab Acquire.vi can be used for the Snap (acquiring one image at a time) or Grab (continuous acquisition of images) functions.

5. SIMULATION SETUP

The temporary setup was made by screwing the NI 1742 Smart Camera to a wooden board. The illumination was set to an optimum level by using a 25 W electric bulb in series with a normal fan regulator. A temporary stand was used for adjusting the height in order to set a proper focus.

Fig 4: Process flow diagram

5.1 LabVIEW Code

The code must be deployed on the camera, so a Real-Time project (.lvproj) was created from the Getting Started window. In the Browse Target option, we selected Smart Camera from the Select a New Target option. The code was written on a VI that was already added to the Smart Camera, here Procyon (IP 192.168.1.2). The code comprised three stages:
- Wait phase: the system waits for an input from the user.
- Learning phase: the user gives training data for the machine to learn a particular type of coin and its attributes/features.
- Classification phase: the user gives random coins to be classified according to the types learnt in the learning phase.

The user has the option of choosing from:
- Learning: training the system
- Classification: classifying random samples
- Done: finishing the run

A minimal sketch of this three-stage flow is given after the list.
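For illustration only, the control flow above can be pictured as the following Python sketch. The actual implementation is a LabVIEW block diagram, and run_sorter, acquire_image, learn and classify are hypothetical names standing in for the IMAQ acquisition and Vision Assistant processing VIs.

```python
# Illustrative stand-in for the LabVIEW three-stage state machine.
# acquire_image(), learn() and classify() are hypothetical placeholders
# for the IMAQ acquisition and Vision Assistant processing VIs.
def run_sorter(acquire_image, learn, classify):
    database = []                                    # records built in the learning phase
    while True:
        # Wait phase: block until the user chooses an action.
        choice = input("Choose [learning/classification/done]: ").strip().lower()
        if choice == "done":                         # finish the run
            break
        image = acquire_image()                      # one frame from the camera (Snap-style)
        if choice == "learning":
            database.append(learn(image))            # store features plus the coin name
        elif choice == "classification":
            print(classify(image, database))         # report the matched coin type
    return database
```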

5.2 Theoretical Background

Essentially, the problem of coin separation boils down to feature-based classification. It can be perceived as the act of taking in raw data and taking an action based on the "category" of the pattern or feature. [3] The image processing part of the code therefore comprised the following steps.

1. Data acquisition. Images were acquired using IMAQ Snap.vi in LabVIEW after initializing a session with IMAQ Init.vi and initializing an image with IMAQ Create. Unless the code is written to a VI added to Procyon (the Smart Camera), a camera interface name conflict error occurs (Error -1074397163; possible reason(s): NI-IMAQ: The passed in interface or session is invalid).

2. Pre-processing. The camera's signals are pre-processed to simplify the subsequent images without losing relevant information. This stage may include filtering for noise removal, image sharpening, colour plane extraction [RGB (red/green/blue), HSL (hue/saturation/luminance), HSV (hue/saturation/value) or HSI (hue/saturation/intensity)], segmentation (to isolate the coins from the background), thresholding, etc. The coin area must be distinguished from the background in order to extract features; this is the role of segmentation, which subdivides an image into its component regions or objects. [4]

Segmentation algorithms are generally based on one of two basic properties of intensity values, namely discontinuities and similarities. Discontinuity-based algorithms partition an image based on sharp changes in intensity (such as edges), while similarity-based algorithms partition an image into regions that are similar according to a set of predefined criteria. Thresholding (Fig 5) draws on both discontinuity and similarity criteria.

Fig 5: Grey-level histograms with (a) a single threshold and (b) multiple thresholds

Since the NI 1742 Smart Camera produces greyscale images, intensity plane extraction was not required as a separate step; the threshold was applied directly to the image taken by the camera.

3. Threshold. IMAQ Threshold.vi from the NI Vision Development Module (.lvlib) was employed, with its Replace Value input set to 1 and front-panel controls wired to the Range input. This allows the user to set appropriate threshold range limits during the learning phase; this information is then utilized by the feature extractor in the classification stage. The output image after this stage is a binary image with only two regions, background and coin area.

This image was given as an input to Vision Assistant.vi, where a script containing the following functions was written (a rough software sketch of these steps follows the list):
- Advanced Morphology - Remove Small Objects: removes any spurious particles that may have appeared in the image.
- Advanced Morphology - Fill Holes: covers the entire coin area, including portions that may not have been captured by the earlier threshold step because of an imperfect threshold range.
- Advanced Morphology - Separate Objects: if more than one coin is placed, the coins must be separated.
- Particle Filter: removes or keeps particles in the image according to the filtering criteria. Particle filtering was done on the Heywood circularity factor (HCF); particles with an HCF between 0.8 and 1.2 were retained. This rejects other unwanted particles in the image, such as dust.
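Purely as an illustration, the following NumPy/SciPy sketch approximates what the threshold and morphology/particle-filter stages do; in the project itself this processing is performed by the NI Vision VIs named above. The minimum particle area and the boundary-pixel perimeter estimate are assumptions made for the sketch, while the 0.8-1.2 HCF window is the one used in the project.

```python
import numpy as np
from scipy import ndimage as ndi

def segment_and_filter(gray, lo, hi, min_area=200, hcf_range=(0.8, 1.2)):
    """Threshold a greyscale coin image and keep only circle-like particles.

    gray      : 2-D uint8 array from the camera
    lo, hi    : threshold range chosen by the user during the learning phase
    min_area  : rough stand-in for "Remove Small Objects" (assumed value)
    hcf_range : Heywood circularity factor limits used by the particle filter
    """
    # Threshold: pixels inside the user-set range become the coin region.
    binary = (gray >= lo) & (gray <= hi)

    # Fill Holes: recover coin pixels missed by an imperfect threshold range.
    binary = ndi.binary_fill_holes(binary)

    # Label connected particles ("Separate Objects" is approximated by labelling).
    labels, n = ndi.label(binary)

    kept = np.zeros_like(binary)
    for i in range(1, n + 1):
        particle = labels == i
        area = particle.sum()
        if area < min_area:                          # Remove Small Objects
            continue
        # Perimeter approximated as the number of boundary pixels.
        boundary = particle & ~ndi.binary_erosion(particle)
        perimeter = boundary.sum()
        # Heywood circularity factor: particle perimeter divided by the
        # perimeter of a circle of the same area, 2 * sqrt(pi * area).
        hcf = perimeter / (2.0 * np.sqrt(np.pi * area))
        if hcf_range[0] <= hcf <= hcf_range[1]:      # keep circle-like particles
            kept |= particle
    return kept
```

The same HCF quantity reappears below as one of the three features used for classification.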
4. Feature extraction. A set of characteristic measurements (numerical or non-numerical) and their relations are extracted to represent patterns for further processing. [5] The task of feature extraction is problem- and domain-dependent and thus requires knowledge of the domain. It is important to look for distinguishing features that are invariant to irrelevant transformations such as rotation, scaling, translation, occlusion and projective distortion. Distinguishing features are those whose values are similar within a category and very different between categories. For the present project, area, perimeter and HCF were the features selected.

Area represents the number of pixels lying within the region identified as the coin. Perimeter represents the number of pixels lying on the boundary of this area. The Heywood circularity factor (HCF) is defined as the ratio of the particle perimeter to the perimeter of a circle of the same area; a perfect circle has an HCF of 1, which makes it a useful tool for shape analysis.

In the Vision Assistant.vi script, the last step was Particle Analysis, where the above three parameters were selected under the Select Measurements option. The particle measurement report was stored in an array. During the learning phase, these reports were combined to form a database from which the comparisons were made in the classification phase. The user is prompted to add a name for each type of coin, which is also added to the particle's attributes in the database.
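As a rough sketch of the learning phase, assuming the same binary particle masks as in the previous example, the three measurements and the user-supplied coin name could be collected as follows. particle_report and learn_coin are illustrative names only; in the project these values come from the Particle Analysis step of the Vision Assistant script.

```python
import numpy as np
from scipy import ndimage as ndi

def particle_report(particle_mask):
    """Compute the three features used in this project for one binary particle."""
    area = int(particle_mask.sum())                           # pixels inside the coin
    boundary = particle_mask & ~ndi.binary_erosion(particle_mask)
    perimeter = int(boundary.sum())                           # boundary-pixel count (approximate)
    # HCF = particle perimeter / perimeter of a circle with the same area.
    hcf = perimeter / (2.0 * np.sqrt(np.pi * area))
    return {"area": area, "perimeter": perimeter, "hcf": hcf}

def learn_coin(particle_mask, database):
    """Learning phase: attach a user-supplied coin name to the particle report."""
    record = particle_report(particle_mask)
    record["name"] = input("Coin type for this sample: ")     # e.g. a denomination label
    database.append(record)
    return record
```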

5. Classification. Processes or events with similar properties are grouped into a class; the number of classes is task-dependent. [5] Classification can therefore be seen as the act of assigning an object to a category using its feature vector. The difficulty of classification depends on the variability of the feature values within a category relative to the difference between feature values in different categories, and the variability within a category may come from noise.

Features extracted from sample coins of unknown denomination were matched, within tolerance limits, against the stored values for each class (coin type) in the database. In the classification phase, all the steps from image acquisition to particle analysis report generation are the same as in the learning phase, as discussed above. A classifier stage was designed which takes this report and compares each feature with the database created in the learning phase. When a match for all three features is found, the classifier returns the type of the coin as entered by the user during the learning phase.
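A minimal sketch of such a tolerance-based matcher is given below. The 10% relative tolerance per feature is an assumed value for illustration; the paper does not state the tolerance limits actually used.

```python
def classify_coin(report, database, tol=0.10):
    """Classification phase: match a particle report against the learnt database.

    report   : dict with "area", "perimeter" and "hcf" for the unknown coin
    database : list of records built in the learning phase (each with a "name")
    tol      : relative tolerance per feature (assumed value, not from the paper)
    """
    for record in database:
        if all(abs(report[f] - record[f]) <= tol * record[f]
               for f in ("area", "perimeter", "hcf")):
            return record["name"]          # all three features matched within tolerance
    return "unknown coin"                  # unknown coins are rejected
```

Returning the first record whose three features all fall inside the tolerance band mirrors the behaviour described above, with unmatched coins reported as unknown.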

6. FINDINGS AND CONCLUSIONS

In this project we successfully sorted Indian currency coins of different denominations on a real-time embedded device. A sample image and the output images after every step of the image processing part of the algorithm are shown below.

Fig 12: Particle analysis report

The particle analysis report for each coin image was generated after all the preceding steps. These reports built up the database in the learning phase; in the classification phase, each report was searched for in that database, and a match or no-match was reported accordingly.

A small live demonstration of the project was given by us at the System Integrators Business and Technical Meet 2011, organized by NI India on June 28, 2011 at The Park, Kolkata.

The system is not yet capable of sorting heaps of mixed coins, but coins arriving on a conveyor belt can easily be classified using an extension of the program. Unknown coins are rejected. Further research may be carried out to improve the recognition results and speed.

REFERENCES

[1] www.ni.com
[2] NI 1742 Smart Camera Datasheet. National Instruments; 2008.
[3] Duda RO, Hart PE, Stork DG. Pattern Classification. 2nd ed. New York: Wiley-Interscience; 2000.
[4] Gonzalez RC, Woods RE. Digital Image Processing. 3rd ed. Addison-Wesley; 1992.
[5] Qi X. Basic Components of Pattern Recognition and Feature Selection. REU Site Program in CVMA; 2011.

BIOGRAPHY OF THE AUTHORS

Bhagirathi V received her M.Tech degree from … University in 2007. She is with Acharya Institute of Technology, Bangalore. Her research interests are in the areas of communication systems, signal conditioning and RF technology.

Meghana M Sinthre received her M.Tech degree in VLSI and Embedded Systems from Visvesvaraya Technological University in 2011. She is with Acharya Institute of Technology, Bangalore. Her research interests are in the areas of communication systems, signal conditioning and RF technology.

Dilip R received his M.E degree in Control and Instrumentation from Bangalore University in 2012. He is with Acharya Institute of Technology, Bangalore. His research interests are in the areas of control systems, signal conditioning, and process and control instrumentation.
