Color machine vision has its challenges. Color systems can produce three times the data (or deliver less than one-third the resolution) of a monochrome camera solution. Color can introduce more potential sources of imaging error, more complexity, and more cost, and it requires careful engineering that reduces the system’s flexibility to deal with lines that make products of varying shapes, colors, and sizes. In fact, if designers can find a way to use filters and lighting to measure a colored area with monochrome cameras, they usually do. However, for many applications, ranging from electronic manufacturing to printing and food processing, color is the only way to solve the problem. Let’s look at some of the considerations a system designer needs to take into account to create a successful color machine vision solution, including careful matching of camera, optics, and light source.
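The trade-offs above can be sketched numerically. The following is an illustrative NumPy snippet (not from the article, and the frame size and pixel values are hypothetical) that mimics the filter-plus-monochrome approach by keeping a single channel of an RGB frame, and shows the three-to-one data ratio between a color and a monochrome capture.

```python
import numpy as np

def simulate_filtered_mono(rgb_frame: np.ndarray, channel: int = 0) -> np.ndarray:
    """Approximate a monochrome capture through a color bandpass filter
    by keeping a single channel of an RGB frame (0=R, 1=G, 2=B)."""
    return rgb_frame[:, :, channel]

# A uniformly "red" 8-bit test scene, 640 x 480.
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
rgb[:, :, 0] = 200

mono = simulate_filtered_mono(rgb, channel=0)

# The color frame carries three times the data of the monochrome one.
print(rgb.nbytes // mono.nbytes)   # -> 3
```

In a real system the filter's passband would be matched to the feature color and the light source, but the data-volume arithmetic is the same.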
The ability to operate and manage operations in a location-agnostic manner opens the door to a wealth of opportunities. For instance, experts and operations staff can be relocated to population centers, and out of harm’s way. They can then be leveraged across multiple assets in real time to ensure maximum utilization. Networked collaboration also allows much faster creation and adoption of best practices across a network of operating assets, thereby contributing to better knowledge retention and management as well as greater efficiency, and establishing a true, shared corporate culture throughout the enterprise.
The Situation: A leading global producer of crude oil and natural gas looked for a way to stay ahead of dynamic market demands and overcome challenges associated with offshore oil and gas automation. As part of an innovative technology project, and with the help of Honeywell, this company built a solution to help coordinate control of multiple offshore platforms in the North Sea and improve operations and efficiency.
With the new central control room (CCR), this company has centralized operations at 18 of its 26 offshore platforms. All operating and production procedures are fully automated and synchronized, creating increased flexibility and competitive advantage. At the heart of the CCR is Honeywell’s Experion Process Knowledge System (PKS), which enables operators to monitor and control production at the various platforms.
A resurgence of the Do It Yourself (DIY) community has driven a range of open hardware platforms, giving aspiring technologists cheap and easy access to embedded development. Beyond hobbyist toys and educational devices, however, “hacker” boards are increasing in performance and I/O flexibility, and have become viable options for professional product development.
The “maker” movements of the past few years quickly gained traction in the education and hobbyist markets, as organizations began producing open hardware boards with a “less-is-more” architecture at a price to match. DIY boards like the Arduino, BeagleBoard, and Raspberry Pi provide “known state” programming platforms that allow easy exploration for novice developers, and enough flexibility for advanced hackers to create some pretty remarkable things – which they have.
Now, Kickstarter projects like Ninja Blocks are shipping Internet of Things (IoT) devices based on the BeagleBone (see this article’s lead-in photo), and startup GEEKROO is developing a Mini-ITX carrier board that will turn the Raspberry Pi into the equivalent of a PC. Beyond the low barrier to market entry presented by these low-cost development platforms, maker boards are being implemented in commercial products because their wide I/O expansion capabilities make them applicable to virtually any application, from robotics and industrial control to automotive and home automation systems. As organizations keep enhancing these board architectures, and more hardware vendors enter the DIY market, the viability of maker platforms for professional product development will continue to increase.
With that said, IT is going to keep moving with an industry that has a definite consumer bias, with product development and release cycles of six months or less. In an industry where the average life expectancy of an automotive production line is eight years, it is impossible to expect the networks in an industrial setting to keep up with modern IT standards. Therefore, we turn our attention to the technologies that have existed the longest, with the most open standards and the best support. These are the protocols we wish to use and keep, and this article highlights and explains some of these technologies.
This article does not focus on the technical implementation of each piece of technology. Rather, it is assumed the reader will be using packaged solutions such as a function block for a PLC. These packages typically require only that the user specify the relevant server to connect to, the data to be gathered, and an activation bit. The particulars of each protocol and concept are, ideally, transparent to the user, so it is not pressing that the user understand what is contained in each packet passed between the server and the client. As each protocol described in this article is openly documented and supported, a simple Internet search will likely yield the relevant implementation details.
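The function-block pattern just described can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API: the class and tag names are invented, and an in-memory dictionary stands in for a real PLC or SCADA endpoint. The point is the interface shape — server, data items, activation bit — with the protocol hidden behind the block.

```python
class ReadBlock:
    """Toy stand-in for a packaged PLC function block: the user supplies
    only a server, the items to gather, and an activation bit."""

    def __init__(self, server, items):
        self._server = server    # stand-in for e.g. an OPC or Modbus endpoint
        self._items = items      # tags the user wants gathered
        self.activate = False    # the activation bit
        self.outputs = {}

    def scan(self):
        """Called each cycle; gathers data only while activated."""
        if self.activate:
            self.outputs = {tag: self._server.get(tag) for tag in self._items}

# An in-memory "server" standing in for real process data.
fake_server = {"motor_speed": 1450, "valve_open": True}
block = ReadBlock(fake_server, ["motor_speed", "valve_open"])

block.scan()                  # activation bit off: nothing gathered
assert block.outputs == {}

block.activate = True
block.scan()
print(block.outputs["motor_speed"])   # -> 1450
```

A real function block would additionally expose status and error outputs, but the user-facing configuration surface is this small.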
refer to: http://www.automation.com/leveraging-it-technology-for-industrial-controls-applications
With the increasing availability, and associated complexity, of a wide variety of 32-bit microcontrollers and microprocessors, the possibilities for embedded product designs are exploding. Leveraging this myriad of embedded computing options and integrating advanced graphical user interfaces and multimedia formats requires supporting software stacks from the underlying operating system. And, more than ever before, embedded software teams are turning to open source software and embedded Linux as the platform on which to base these “Internet of Things” systems. But while open source has proved itself an incredible technology enabler, it can also make the workflow excessively unwieldy. The good news is that solutions and best practices exist to help development teams improve their software development workflow when open source is an increasingly large part of the mix.
refer to: http://embedded-computing.com/articles/the-not-code-quality/
The MOST Cooperation – the organization through which the leading automotive multimedia network Media Oriented Systems Transport (MOST) is standardized – announces that the newest Specification Rev. 3.0 is on its way to production, with the first car makers committing to the MOST150 network in selected vehicles from 2011. Various car makers have already started first series projects implementing this latest MOST technology. MOST150 enables a higher bandwidth of 150 Mbps, an isochronous transport mechanism to support extensive video applications, and an embedded Ethernet channel for efficient transport of IP-based packet data. It provides significant speed enhancements and breakthroughs while keeping costs down. The new Intelligent Network Interface Controller (INIC) architecture complies with Specification Rev. 3.0 and expands the audio/video capability for next-generation automotive infotainment devices such as head units, rear seat entertainment, amplifiers, TV tuners, and video displays.
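As a back-of-the-envelope illustration of what a 150 Mbps network means for infotainment, the snippet below budgets the channel. Only the 150 Mbps figure comes from the announcement; the reserved Ethernet share and the per-stream video bitrate are illustrative assumptions, not MOST specification values.

```python
# Rough channel budget for a 150 Mbps MOST150 network (illustrative only).
TOTAL_MBPS = 150

ethernet_share_mbps = 30   # assumed reservation for the embedded Ethernet channel
video_stream_mbps = 20     # assumed bitrate of one isochronous video stream

available_mbps = TOTAL_MBPS - ethernet_share_mbps
streams = available_mbps // video_stream_mbps
print(streams)             # -> 6 concurrent video streams under these assumptions
```

With assumptions like these, a single MOST150 ring could carry IP packet data alongside several simultaneous video streams, which is the headroom rear seat entertainment and display applications need.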
refer to: http://embedded-computing.com/news/most150-series-adoption/
How are these technical problems best solved by industry and the EEMBC?
refer to: http://embedded-computing.com/articles/moving-qa-markus-levy-founder-president-eembc/
The 4th generation Intel® Core™ processors
The 4th generation Intel® Core™ processors serve the embedded computing space with a new microarchitecture, which Kontron will implement on a broad range of embedded computing platforms. Based on the 22 nm Intel® 3D transistor technology already used in the predecessor generation, the processors, formerly codenamed ‘Haswell’, deliver a performance increase that will doubtlessly benefit applications. Besides roughly 15 percent higher CPU performance, graphics performance in particular has doubled in comparison to solutions based on the previous generation of processors. At the same time, the thermal footprint has remained practically the same or has even shrunk.
These improvements, and the high scalability from cost-optimized Celeron® versions up to high-end Intel® Core™ i7 and Xeon® processors, make the new Intel® Core™ microarchitecture a perfect match for nearly every mid-range to high-end embedded application. In a first step, Kontron has implemented the new microarchitecture on COM Express®, Mini-ITX, 6U CompactPCI®, and the Kontron SYMKLOUD Media cloud platforms, with further platforms to follow. So, in what way can embedded appliances benefit from these improvements?
refer to: http://embedded-computing.com/white-papers/white-intelr-coretm-processors/
The ability to transition between x86 and ARM processors is critical for low-volume medical applications because a single carrier board – often the most costly component of a COM architecture – can suit the needs of both graphics-intensive systems and platforms that require more mobility and lower power. In addition to reducing Time-To-Market (TTM), this decreases Bill Of Materials (BOM) costs and eases Board Support Package (BSP) implementation, says Christoph Budelmann, General Manager, Budelmann Elektronik GmbH in Münster, Germany (www.budelmann-elektronik.com).
“Scalability is a key factor, especially for lower volumes, and the Qseven standard offers the possibility to use the same baseboard with different processors depending on the user’s needs,” Budelmann says. “Some users only need a small control unit and prefer a simple ARM processor, whereas other customers want to implement large screens and need the graphical power of an x86 system. Of course, this can also be the case in medical applications. Even if the baseboard has to be adapted to very special demands, this is less complex than switching from a pure ARM platform to an x86 platform or vice versa. In the majority of cases, only some drivers, such as Ethernet PHY, have to be exchanged whereas the real application software can remain the same.”
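The portability argument in the quote above can be sketched as a thin board-support layer beneath unchanged application code. All class and method names below are invented for illustration; on real hardware each driver would touch SoC- or chipset-specific registers, while the application layer stays identical across ARM and x86 Qseven modules.

```python
class ArmPhyDriver:
    """Stand-in BSP driver for an ARM-based Qseven module."""
    def link_up(self) -> bool:
        return True   # would poll the ARM SoC's Ethernet PHY on real hardware

class X86PhyDriver:
    """Stand-in BSP driver for an x86-based Qseven module."""
    def link_up(self) -> bool:
        return True   # would poll the x86 chipset's Ethernet PHY instead

def application(phy) -> str:
    """Board-agnostic application logic: only the BSP layer differs."""
    return "network ready" if phy.link_up() else "no link"

# The same application source runs against either board-support package.
print(application(ArmPhyDriver()))   # -> network ready
print(application(X86PhyDriver()))  # -> network ready
```

This is the design choice the Qseven standard enables: the swap happens at the driver boundary, so the carrier board and the application survive the processor change.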
refer to: http://smallformfactors.com/articles/qseven-coms-healthcare-mobile/