<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Edge Computing | Future Markets Magazine – FMM</title>
	<atom:link href="https://future-markets-magazine.com/en/markets-and-technology-edge-computing-en/feed/" rel="self" type="application/rss+xml" />
	<link>https://future-markets-magazine.com/en/markets-and-technology-edge-computing-en/</link>
	<description></description>
	<lastBuildDate>Wed, 16 Jun 2021 14:11:05 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.6.2</generator>

<image>
	<url>https://future-markets-magazine.com/wp-content/uploads/2021/04/cropped-TQ_byEBV_512x512px-2-32x32.png</url>
	<title>Edge Computing | Future Markets Magazine – FMM</title>
	<link>https://future-markets-magazine.com/en/markets-and-technology-edge-computing-en/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Multiple fisheye-stereo cameras</title>
		<link>https://future-markets-magazine.com/en/markets-technology-en/multiple-fisheye-stereo-cameras/</link>
		
		<dc:creator><![CDATA[The Quintessence]]></dc:creator>
		<pubDate>Fri, 25 Dec 2020 17:00:56 +0000</pubDate>
				<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[Markets & Technology]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[automotive system architecture]]></category>
		<category><![CDATA[Automotive Systemarchitektur]]></category>
		<category><![CDATA[DRAM-Technologie]]></category>
		<category><![CDATA[driverless vehicles]]></category>
		<category><![CDATA[Echtzeit]]></category>
		<category><![CDATA[environmental sensors]]></category>
		<category><![CDATA[fahrerlose Fahrzeuge]]></category>
		<category><![CDATA[Fischaugenobjektiv]]></category>
		<category><![CDATA[Fisheye lenses]]></category>
		<category><![CDATA[fully automated vehicle]]></category>
		<category><![CDATA[highly automated vehicle]]></category>
		<category><![CDATA[hochautomatisiertes Fahrzeug]]></category>
		<category><![CDATA[künstliche Intelligenz]]></category>
		<category><![CDATA[machine learning]]></category>
		<category><![CDATA[maschinelles lernen]]></category>
		<category><![CDATA[Multi-Fischaugen-Stereo-Kamera-Sensorsystem]]></category>
		<category><![CDATA[Multi-Fischaugen-Stereosystem]]></category>
		<category><![CDATA[multi-stereo sensor system]]></category>
		<category><![CDATA[Multiple fisheye-stereo cameras]]></category>
		<category><![CDATA[real time]]></category>
		<category><![CDATA[Sensorsystem]]></category>
		<category><![CDATA[Umfeldsensoren]]></category>
		<category><![CDATA[vollautomatisiert]]></category>
		<guid isPermaLink="false">https://future-markets-magazine.com/?p=9152</guid>

					<description><![CDATA[<p>Today’s automated cars are already able to react faster than their human drivers. This is&#8230;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/multiple-fisheye-stereo-cameras/">Multiple fisheye-stereo cameras</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Today&rsquo;s automated cars are already able to react faster than their human drivers. This is made possible by extremely powerful on-board computers that collate data from all sensor systems, evaluate it within a split second and initiate appropriate actions.</strong></p>
<p>If a child runs into the road, a human driver takes 1.6 seconds on average to depress the brake pedal. But highly automated vehicles equipped with <a href="https://future-markets-magazine.com/en/encyclopedia/radar/" target="_blank" title="Radio detection and ranging" class="encyclopedia">radar</a> or <a href="https://future-markets-magazine.com/en/encyclopedia/lidar/" target="_blank" title="A method of optical distance and speed measurement related to radar. It involves the emission&hellip;" class="encyclopedia">lidar</a> sensors and a camera system already react in 0.5 seconds.</p>
<h2><strong>Real-time environmental perception</strong></h2>
<p>This is only possible if the data is processed on board the vehicle itself in powerful embedded image-processing computers, which continuously generate a complete picture of the surrounding traffic situation in real time. After all, a fully automated vehicle generates between 30 and 40 terabytes of data per eight hours of driving, the equivalent of around 3,500 4K movies. Neither the current nor a future 5G network architecture could cope with this kind of volume. In addition, cars with automated-driving functions simply cannot afford to wait for information stored and processed in the <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a>.</p>
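<p>The bandwidth implied by those figures can be checked with a few lines of arithmetic. The sketch below is illustrative only; it simply converts the article&rsquo;s 40 terabytes per eight hours into a sustained data rate:</p>

```python
# Convert the article's figures (40 TB per 8 h of driving) into a
# sustained data rate, and derive the implied size per 4K movie.
TB = 1e12  # terabyte in bytes (decimal convention)

data_bytes = 40 * TB
seconds = 8 * 3600

rate_gbit_s = data_bytes * 8 / seconds / 1e9   # sustained Gbit/s
movie_gb = data_bytes / 3500 / 1e9             # implied GB per movie

print(f"{rate_gbit_s:.1f} Gbit/s sustained")
print(f"{movie_gb:.1f} GB per 4K movie")
```

<p>A sustained double-digit-gigabit uplink per vehicle is far beyond what any cellular connection realistically provides, which is why the processing has to stay on board.</p>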
<h2><strong>AI is a crucial module</strong></h2>
<p>According to Robert Bielby, Senior Director for Automotive System Architecture in Micron&rsquo;s Embedded Business Unit, high-performance computers featuring artificial intelligence with deep neural-network <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s will enable autonomous cars to drive better than vehicles controlled by humans. &ldquo;You&rsquo;ve got a host of different sensors that work together to see the entire environment in 360 degrees, 24/7, at a greater distance and with higher accuracy than humans can,&rdquo; Bielby says. &ldquo;Combine that with the extreme compute performance that today can be deployed in a car, and you have a situation where it is possible for cars to do a far better job of driving down the road with greater safety than we can.&rdquo;</p>
<p>Micron Technology offers a broad portfolio of volatile and non-volatile memory products for automotive applications. These products are also used, for example, in a computing platform specially developed for autonomous driving and based on high-performance DRAM technology.</p>
<h2><strong>Hundreds of trillions of computing operations per second</strong></h2>
<p>Among other things, the AI platform delivers the computing power needed by Daimler&rsquo;s highly automated vehicles. Artificial intelligence is an important building block in fully automated and driverless vehicles&rsquo; <a href="https://future-markets-magazine.com/en/encyclopedia/ecu/" target="_blank" title="Intelligent processor-controlled units and modules in the automotive sector which monitor specific functions and report&hellip;" class="encyclopedia">ECU</a> networks, each of which consists of multiple individual control units. Overall, the <a href="https://future-markets-magazine.com/en/encyclopedia/ecu/" target="_blank" title="Intelligent processor-controlled units and modules in the automotive sector which monitor specific functions and report&hellip;" class="encyclopedia">ECU</a> network in Daimler vehicles achieves a computing capacity of hundreds of trillions of operations per second, equating to the performance of at least six interconnected, state-of-the-art computer workstations.</p>
<p>Here, information from various environmental sensors, based on <a href="https://future-markets-magazine.com/en/encyclopedia/radar/" target="_blank" title="Radio detection and ranging" class="encyclopedia">radar</a>, video, <a href="https://future-markets-magazine.com/en/encyclopedia/lidar/" target="_blank" title="A method of optical distance and speed measurement related to radar. It involves the emission&hellip;" class="encyclopedia">lidar</a> and ultrasound technology, for example, is merged. The <a href="https://future-markets-magazine.com/en/encyclopedia/ecu/" target="_blank" title="Intelligent processor-controlled units and modules in the automotive sector which monitor specific functions and report&hellip;" class="encyclopedia">ECU</a> network compiles the data from all environmental sensors, and <a href="https://future-markets-magazine.com/en/encyclopedia/sensor-fusion/" target="_blank" title="The intelligent convergence and processing of all (environmental) sensor data required for autonomous processes. The&hellip;" class="encyclopedia">sensor fusion</a> evaluates it in milliseconds and then uses this information to plan the vehicle&rsquo;s travel path. This is comparable to the speed of a pain stimulus in humans, which takes between 20 and 500 milliseconds to reach the brain.</p>
<h2><strong>Braking in under 10 milliseconds </strong></h2>
<p>Research into even faster systems is nevertheless being conducted. After all, even though an automated car reacts within 500 milliseconds, at a speed of 50 km/h it still travels seven metres without braking in that time. With this in mind, the Fraunhofer Institute for Reliability and Microintegration (IZM) is working on a camera-<a href="https://future-markets-magazine.com/en/encyclopedia/radar/" target="_blank" title="Radio detection and ranging" class="encyclopedia">radar</a> module that can register changes in the traffic situation much more rapidly. The module &ndash; which is roughly the size of a mobile phone &ndash; will have a response time of under 10 milliseconds.</p>
<p>Integrated signal processing is to thank for this. Data from the <a href="https://future-markets-magazine.com/en/encyclopedia/radar/" target="_blank" title="Radio detection and ranging" class="encyclopedia">radar</a> system and the <a href="https://future-markets-magazine.com/en/encyclopedia/stereo-camera/" target="_blank" title="When a stereo camera captures an object, there is a spatial disparity between corresponding points&hellip;" class="encyclopedia">stereo camera</a> are processed and filtered directly in (or on) the module: irrelevant information is detected but not passed along. The data from the camera and <a href="https://future-markets-magazine.com/en/encyclopedia/radar/" target="_blank" title="Radio detection and ranging" class="encyclopedia">radar</a> are merged through <a href="https://future-markets-magazine.com/en/encyclopedia/sensor-fusion/" target="_blank" title="The intelligent convergence and processing of all (environmental) sensor data required for autonomous processes. The&hellip;" class="encyclopedia">sensor fusion</a>. Underpinned by neural networks, the content of these data is evaluated using <a href="https://future-markets-magazine.com/en/encyclopedia/machine-learning/" target="_blank" title="Procedure by which computer systems acquire knowledge independently and can expand their knowledge, allowing them&hellip;" class="encyclopedia">machine learning</a>. The system subsequently sends no status information to the vehicle, only instructions for how to react. As such, the bus line of the vehicle remains free for important signals, such as a warning about a child suddenly running out into the road.</p>
<p>&ldquo;This integrated signal processing shortens the response time enormously,&rdquo; says Christian Tschoban, Group Leader in RF &amp; Smart Sensor Systems at the IZM. Once finished, the system is intended to be 50 times faster than conventional sensor systems and 160 times as fast as a human being. A car would then travel only a further 15 centimetres before the system kicked in and sent a signal to brake, which could help avoid many accidents in urban traffic.</p>
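<p>The distances quoted in this article follow directly from distance = speed &times; reaction time. A quick sketch with the article&rsquo;s numbers (the small deviation from the quoted 15 centimetres is rounding):</p>

```python
# Distance travelled before braking begins: d = v * t_reaction
def reaction_distance_m(speed_kmh: float, reaction_s: float) -> float:
    """Metres covered at a given speed during the reaction time."""
    return speed_kmh / 3.6 * reaction_s

v = 50  # km/h, the article's urban-traffic example

print(reaction_distance_m(v, 1.6))    # human driver: ~22.2 m
print(reaction_distance_m(v, 0.5))    # today's automated car: ~6.9 m
print(reaction_distance_m(v, 0.010))  # IZM module target: ~0.14 m
```

<p>The 10-millisecond module would thus shave almost seven metres off the blind travel distance in the child-on-the-road scenario.</p>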
<h2><strong>One eye fixed firmly on the surroundings</strong></h2>
<p class="p1"></p><div class="su-box su-box-style-default" style="border-color:#006466;border-radius:3px"><div class="su-box-title" style="background-color:#007D7F;color:#FFFFFF;border-top-left-radius:1px;border-top-right-radius:1px">Multiple fisheye-<a href="https://future-markets-magazine.com/en/encyclopedia/stereo-camera/" target="_blank" title="When a stereo camera captures an object, there is a spatial disparity between corresponding points&hellip;" class="encyclopedia">stereo camera</a>s</div><div class="su-box-content su-clearfix" style="border-bottom-left-radius:1px;border-bottom-right-radius:1px">
<p>An alliance of companies and institutes &ndash; including MicroSys Electronics GmbH, TheImagingSource, Myestro, NewTec and the Institute for Laser Technology in Medicine and Measurement Technique (ILM) in Ulm &ndash; has developed a sensor system with multiple fisheye-<a href="https://future-markets-magazine.com/en/encyclopedia/stereo-camera/" target="_blank" title="When a stereo camera captures an object, there is a spatial disparity between corresponding points&hellip;" class="encyclopedia">stereo camera</a>s that companies from all manner of markets can deploy in highly automated vehicles, without needing to build up specific expertise in environmental recognition themselves.</p>
<p>The system comprises several multi-stereo sensor systems, each of which features four quadrilaterally arranged cameras in addition to a dedicated laser system. Cameras from neighbouring multi-stereo systems that are further apart together form additional stereo pairs for longer ranges, while fisheye lenses keep the required number of multi-<a href="https://future-markets-magazine.com/en/encyclopedia/stereo-camera/" target="_blank" title="When a stereo camera captures an object, there is a spatial disparity between corresponding points&hellip;" class="encyclopedia">stereo camera</a>s low. This configuration, patented by Myestro, enables both close- and long-range obstacles to be measured simultaneously. To compensate for vibrations in the vehicle body that would otherwise prevent usable image-information recordings, Myestro has developed its &ldquo;RubberStereo&rdquo; technology, which immediately detects and offsets vibrations in real time by comparing the image data from the pair of cameras. The system runs on an embedded computer from MicroSys, which, in turn, is built around a processor developed especially for automotive-vision applications. The platform combines signal-processing and computing functions for environmental detection with a very compact form factor.</p></div></div>
<p>&nbsp;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/multiple-fisheye-stereo-cameras/">Multiple fisheye-stereo cameras</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>What is AIfES?</title>
		<link>https://future-markets-magazine.com/en/markets-technology-en/what-is-alfes/</link>
		
		<dc:creator><![CDATA[The Quintessence]]></dc:creator>
		<pubDate>Thu, 31 Dec 2020 17:00:50 +0000</pubDate>
				<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[Markets & Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AlfES]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[chipdesign]]></category>
		<category><![CDATA[chips]]></category>
		<category><![CDATA[chipsoftware]]></category>
		<category><![CDATA[Computerchips]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[Eyeriss]]></category>
		<category><![CDATA[ki]]></category>
		<category><![CDATA[Kleinstelektronen]]></category>
		<category><![CDATA[künstliche Intelligenz]]></category>
		<category><![CDATA[miniaturised electronics]]></category>
		<category><![CDATA[nano drone]]></category>
		<category><![CDATA[Nanodrohne]]></category>
		<category><![CDATA[Navion]]></category>
		<category><![CDATA[was ist alfes?]]></category>
		<category><![CDATA[What is AlfES?]]></category>
		<guid isPermaLink="false">https://future-markets-magazine.com/?p=9050</guid>

					<description><![CDATA[<p>The concurrent development and design of chips and software has enabled researchers to realise chips&#8230;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/what-is-alfes/">What is AIfES?</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>The concurrent development and design of chips and software has enabled researchers to realise chips that are not only uncommonly small and energy-efficient, but also boast powerful AI capabilities extending right through to training. One possible application scenario might be in nano drones that can navigate their way through a room independently.</strong></p>
<p>To the untrained eye, the chip does not look any different from the ones found in any ordinary electronic device. Yet the chips developed by the Massachusetts Institute of Technology (MIT) and named Eyeriss and Navion are a revelation, and might just be the key to the future of artificial intelligence. Using these chips, even the most diminutive <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> devices could be equipped with powerful &ldquo;smart&rdquo; abilities of a kind that, until now, only gigantic data centres have been able to provide.</p>
<h2><strong>Energy efficiency is vital</strong></h2>
<p>Professor Vivienne Sze from the MIT Department of Electrical Engineering and Computer Science (EECS) &ndash; and a member of the development team &ndash; is keen to emphasise that the real opportunity offered by these chips lies not in their impressive capability for <a href="https://future-markets-magazine.com/en/encyclopedia/deep-learning/" target="_blank" title="Sub-area of machine learning in which deep neural networks are used. Whilst machine learning works&hellip;" class="encyclopedia">deep learning</a> but in their energy efficiency. The chips need to master computationally intensive <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s while making do with the energy available on the <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> devices themselves; only then can AI find widespread application at the &ldquo;edge&rdquo;. The Eyeriss chip is 10 or even 1,000 times more energy-efficient than current hardware.</p>
<h2><strong>A symbiosis of software and hardware</strong></h2>
<p>In Professor Sze&rsquo;s lab, research is also underway to determine how software ought to be designed to fully harness the power of computer chips. A low-power chip called Navion was developed at MIT to answer this question. To name one potential application, this chip could be used to navigate a drone using 3D maps with previously unthinkable efficiency. Above all, such a drone can be minute, no larger than a bee.</p>
<p>The concurrent development of the AI software and hardware was crucial in this instance. It enabled researchers to build a chip just 20 square millimetres in size in the form of Navion. This chip requires only 24 milliwatts of power, around a thousandth of the energy consumed by a light bulb. The chip can use this tiny amount of power to process up to 171 camera images per second and additionally perform inertial measurements, all in real time. Using this data, it calculates its position in the room. It might also be conceivable for the chip to be incorporated into a small pill that could gather and evaluate data from inside the human body once swallowed.</p>
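<p>Those two figures, 24 milliwatts and 171 frames per second, pin down the chip&rsquo;s energy budget per image. A short calculation makes this concrete (an illustrative sketch derived from the article&rsquo;s numbers, not MIT&rsquo;s published methodology):</p>

```python
# Energy per processed camera frame for the Navion chip,
# from the article's power-draw and frame-rate figures.
power_w = 0.024      # 24 mW total power draw
frames_per_s = 171   # camera images processed per second

energy_per_frame_mj = power_w / frames_per_s * 1000  # millijoules
print(f"{energy_per_frame_mj:.3f} mJ per frame")
```

<p>At roughly 0.14 millijoules per frame, a small battery lasts a long time, which is what makes bee-sized drones or swallowable sensors plausible hosts.</p>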
<p>The chip achieves this level of efficiency through a variety of measures. For one thing, it minimises the data volume, in the form of camera images and inertial measurements, that is stored on the chip at any given point in time. The developer team was also able to physically reduce the bottleneck between the data&rsquo;s storage location and the location in which they are analysed, not to mention coming up with clever schemes for the data to be re-used. The way that these data flow through the chip has also been optimised, and certain computation steps are skipped entirely, such as computations involving zeros whose result is known to be zero.</p>
<h2><strong>A basis for self-learning, miniaturised electronics</strong></h2>
<p>Research into how AI can be more effectively integrated into edge devices is also being conducted at other institutes. Accordingly, a team of researchers at the Fraunhofer Institute for Microelectronic Circuits and Systems (IMS) has developed a kind of artificial intelligence for microcontrollers and sensors that comprises a fully configurable artificial neural network.</p>
<p>This solution &ndash; called AIfES &ndash; is a platform-independent machine-learning library with which self-learning, miniaturised electronics that require no connection to <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> infrastructure or powerful computers can be realised. The library constitutes a fully configurable artificial neural network, which can also generate appropriately deep networks for <a href="https://future-markets-magazine.com/en/encyclopedia/deep-learning/" target="_blank" title="Sub-area of machine learning in which deep neural networks are used. Whilst machine learning works&hellip;" class="encyclopedia">deep learning</a> if needed. The source code has been reduced to a minimum, meaning that the AI can even be trained on the microcontroller itself; until now, this training phase has only been possible in data centres. AIfES is not concerned with processing large data volumes. Instead, only the strictly required data is transferred in order to set up very small neural networks.</p>
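<p>AIfES itself is a C library, but the core idea, a neural network small enough to be trained on the device that uses it, can be sketched in a few lines. The toy network below is not the AIfES API; the layer sizes and the XOR task are arbitrary choices for illustration. It learns from four samples with plain gradient descent, and its entire state is a few dozen floating-point numbers, well within a microcontroller&rsquo;s RAM:</p>

```python
import numpy as np

# Tiny fully connected net (2 inputs, 4 hidden units, 1 output)
# trained on XOR with plain batch gradient descent. A toy stand-in
# for on-device training, NOT the AIfES C API.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
sig = lambda z: 1 / (1 + np.exp(-z))

losses, lr = [], 0.5
for _ in range(5000):
    h = sig(X @ W1 + b1)              # hidden activations
    out = sig(h @ W2 + b2)            # network output
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    d2 = err * out * (1 - out)        # backprop through output sigmoid
    d1 = (d2 @ W2.T) * h * (1 - h)    # backprop through hidden layer
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

print("initial loss:", losses[0], "final loss:", losses[-1])
```

<p>The same loop structure, forward pass, error, weight update, is what on-device training means in practice; AIfES implements it in portable C so it runs on 8-bit microcontrollers.</p>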
<h2>What is AIfES?</h2>
<p class="p1"></p><div class="su-box su-box-style-default" style="border-color:#006466;border-radius:3px"><div class="su-box-title" style="background-color:#007D7F;color:#FFFFFF;border-top-left-radius:1px;border-top-right-radius:1px">AIfES</div><div class="su-box-content su-clearfix" style="border-bottom-left-radius:1px;border-bottom-right-radius:1px">In technical terms, the possibilities for applying AIfES are almost limitless. For example, an armband with integrated gesture recognition might be used to control the lighting in buildings. However, it is also possible to monitor how well gestures are performed, not merely to recognise them. In rehabilitation or fitness applications, exercises and sequences of movements could even be evaluated without a trainer present. The fact that there is no camera or <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> involved means that privacy is guaranteed. AIfES can be used in many fields, from automotive and medical applications to smart homes or <a href="https://future-markets-magazine.com/en/encyclopedia/industry-4-0/" target="_blank" title="also known as Smart Manufacturing" class="encyclopedia">industry 4.0</a>.</div></div>
<p>The team of researchers has already produced several demonstrators, including one on a cheap 8-bit microcontroller for detecting hand-written numbers. An additional demonstrator can detect complex gestures and numbers written in the air. For this application, scientists at the IMS developed a detection system comprising a microcontroller and an absolute orientation sensor. To begin with, multiple people are required to write the numbers from zero to nine several times over. The neural network detects these written patterns, learns them and autonomously identifies them in the next step.</p>
<p>Studies conducted at the research institutes provide an outlook for how AI software and hardware will continue to develop symbiotically in the future and open up complex AI functions in <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> and edge devices in the process.</p>
<p>&nbsp;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/what-is-alfes/">What is AIfES?</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Demands on intralogistics are growing</title>
		<link>https://future-markets-magazine.com/en/markets-technology-en/demands-on-intralogstics-are-growing/</link>
		
		<dc:creator><![CDATA[The Quintessence]]></dc:creator>
		<pubDate>Thu, 31 Dec 2020 07:00:31 +0000</pubDate>
				<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[Markets & Technology]]></category>
		<category><![CDATA[assistance systems]]></category>
		<category><![CDATA[Assistenzsysteme]]></category>
		<category><![CDATA[autarke Steuerung]]></category>
		<category><![CDATA[automated guided vehicles]]></category>
		<category><![CDATA[autonom]]></category>
		<category><![CDATA[conveyor system]]></category>
		<category><![CDATA[e-Cart]]></category>
		<category><![CDATA[Fahrerloses Transportsystem]]></category>
		<category><![CDATA[Fördersystem]]></category>
		<category><![CDATA[FTS]]></category>
		<category><![CDATA[industrie 4.0]]></category>
		<category><![CDATA[industry 4.0]]></category>
		<category><![CDATA[Intralogistics]]></category>
		<category><![CDATA[Intralogistik]]></category>
		<category><![CDATA[Kinexon Industries]]></category>
		<category><![CDATA[logistics systems]]></category>
		<category><![CDATA[Logistik Systeme]]></category>
		<category><![CDATA[selbstfahrend]]></category>
		<category><![CDATA[sensor]]></category>
		<category><![CDATA[transport system]]></category>
		<guid isPermaLink="false">https://future-markets-magazine.com/?p=9059</guid>

					<description><![CDATA[<p>The demands on intralogistics are growing in the Industry 4.0 era. Robust computer systems integrated&#8230;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/demands-on-intralogstics-are-growing/">Demands on intralogistics are growing</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>The demands on intralogistics are growing in the <a href="https://future-markets-magazine.com/en/encyclopedia/industry-4-0/" target="_blank" title="also known as Smart Manufacturing" class="encyclopedia">Industry 4.0</a> era. Robust computer systems integrated in transport vehicles ensure that vehicles are flexible and highly available to meet these demands. They even allow assistance systems in forklifts of a kind that has, to date, only been familiar from high-end cars.</strong></p>
<p>Maximum availability and flexibility are right at the top of the list of requirements for intralogistics. Against this backdrop, the company Krups F&ouml;rdersysteme has developed an intelligent conveyor system that breaks &ndash; or travels &ndash; completely new ground. The e-Cart transport system consists of self-propelled smart workpiece carriers that travel on a passive conveyor.</p>
<div class="su-note" style="border-color:#007072;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><div class="su-note-inner su-clearfix" style="background-color:#007D7F;border-color:#cce5e5;color:#ffffff;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;">The e-Cart transport system is designed especially to meet the requirements of assembly and test automation. Typical applications include battery production and powertrain components such as engines, transmissions and axles. The load capacity is between 100 and 1,350 kg per e-Cart for a base item of 1,000 x 700 mm.</div></div>
<p>The e-Cart has its own obstacle detection and stops on contact with an active brake. Fitted with a self-diagnosis function, each e-Cart can detect when maintenance is due or if there is a potential fault. It can then be discharged automatically to a separate maintenance station as needed. All movements and queries are controlled autonomously via an integrated smart module, which is based on a block controller from Turck. Only the desired destination is predefined by the higher-level customer control system via a data bus. The maintenance-free running rails with their smart, powered carts, bundled with decentrally controlled standard system components, ensure reliable assembly automation. This increases availability and allows flexible processes with minimum lot sizes.</p>
<h2><strong>Precise positioning for driverless transport systems</strong></h2>
<p>Driverless vehicles, which transport goods automatically through the factory, are an integral element of many <a href="https://future-markets-magazine.com/en/encyclopedia/industry-4-0/" target="_blank" title="also known as Smart Manufacturing" class="encyclopedia">Industry 4.0</a> logistics concepts. They require detailed position and status data to operate smoothly. Navigation systems to date have used ground markers for this purpose, such as lines or magnetic strips. But soil contamination, disruptive incident light or dust make it more difficult for the driverless transport system to recognise its path.</p>
<p>Kinexon Industries adopted a new approach for Kinexon Brain, its navigation software for driverless transport systems. It uses sensor-based positioning and consolidates several position sensors: on the one hand, data from the inertial navigation system, and on the other, laser data. Both systems are supplemented by position information from a wireless location system, which can identify the position and status of objects based on ultra-wideband positioning. The navigation software automatically chooses the best location information in every situation and combines this to estimate positions precisely. Large volumes of data are required for such a complex navigation system to function, and it must be possible to process this data quickly and efficiently, even under difficult ambient conditions.</p>
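<p>Kinexon does not disclose how Kinexon Brain weights its sources, but a standard way to combine several position estimates of differing quality is inverse-variance weighting: trust each sensor in proportion to its precision. The sketch below uses made-up readings; all names and numbers are illustrative, not Kinexon&rsquo;s actual algorithm:</p>

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of 1-D position estimates.

    estimates: list of (position, variance) pairs, e.g. from an
    inertial unit, a laser scanner and a UWB location system.
    Returns the fused position and its (smaller) combined variance.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    pos = sum(w * p for w, (p, _) in zip(weights, estimates)) / total
    return pos, 1.0 / total

# Hypothetical readings: drifted inertial fix, precise laser fix,
# coarse ultra-wideband fix (position in metres, variance in m^2).
fused, var = fuse([(10.4, 0.50), (10.1, 0.02), (9.8, 0.90)])
print(fused, var)
```

<p>The fused estimate lands close to the most precise sensor while still using the others, and its variance is smaller than any single source&rsquo;s, which is the whole point of consolidating several position sensors.</p>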
<p>Kinexon uses an A-series KBox from Kontron, one of the world&rsquo;s leading providers of embedded computer technology. Thanks to its compact design, the <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a>-enabled box PC is especially suited for integration in low-profile, autonomous forklift trucks. One highlight of the KBox is that it operates without a fan. Flat, autonomous industrial trucks travel close to the floor, which means they have to cope with a lot of dust; an industrial PC equipped with a fan is therefore not an ideal solution here. The KBox from Kontron, however, offers good passive ventilation, meaning it can be used at ambient temperatures of up to 60 degrees Celsius without being affected by dust.</p>
<h2><strong>Accident-free driving for forklifts</strong></h2>
<p>High-performance edge computers are not only used in driverless, automated intralogistics systems, however. They can also help drivers of conventional forklift trucks, one example being the safety kit for forklifts developed by Via Technologies. The system was developed for the Xingchen Hongye Technology Development Company in China, whose aim was to enhance the safety of its existing forklift fleet with the solution, called Via Mobile360.</p>
<p>One of the key approaches in this context was the development of a driver monitoring system. It ensures that the person behind the steering wheel devotes their full attention to operating the vehicle. A camera monitors the driver&rsquo;s cab and streams real-time video to an on-board computer. Using corresponding <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s, it recognises whether the driver is on the phone, distracted or tired. If necessary, the system warns the driver both visually and acoustically; it even reminds drivers to wear their safety belts. Driver <a href="https://future-markets-magazine.com/en/encyclopedia/authentication/" target="_blank" title="Ensures that the communication partner at the other end is authentic." class="encyclopedia">authentication</a> was also integrated in order to ensure that the forklift is only used by trained drivers.</p>
<p>The monitoring system was enhanced by facial recognition <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s for this purpose. The camera has to recognise a driver&rsquo;s face as the authorised operator before the driver can commence their shift. A surround view system with four cameras allows the driver to maintain a 360&deg; overview of the surroundings and therefore prevents accidents with other employees or vehicles. The system uses high-tech 3D <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s, which ensure that a sharp and clear panorama picture is shown on a display in the cockpit.</p>
<h2>Safety solution with ADAS</h2>
<p>The final building block of the safety solution is the enhancement of the on-board system with ADAS (Advanced Driver Assistance System) functions that warn of collisions from the front, rear and side. Thanks to three additional cameras, the driver&rsquo;s attention can be drawn to situations in which the forklift comes dangerously close to objects. The cameras essentially act as sensors that detect potential collisions and send warnings directly to the cockpit display. This also includes dynamic recognition of moving objects, so the system can warn the driver when an object or person is moving in the direction of the forklift. All <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s run on a high-performance, scalable on-board computer system.</p>
<p>This example, too, shows how edge computing can make intralogistics safer, higher-performing and highly available.</p>
<p>&nbsp;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/demands-on-intralogstics-are-growing/">Demands on intralogistics are growing</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Edge Computing helps AI take off</title>
		<link>https://future-markets-magazine.com/en/markets-technology-en/edge-computing-helps-ai-take-off/</link>
		
		<dc:creator><![CDATA[The Quintessence]]></dc:creator>
		<pubDate>Wed, 30 Dec 2020 17:00:39 +0000</pubDate>
				<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[Markets & Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Applications]]></category>
		<category><![CDATA[AI Edge Devices]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[cloud infrastructure]]></category>
		<category><![CDATA[Cloud-Infrastruktur]]></category>
		<category><![CDATA[edge computing]]></category>
		<category><![CDATA[edge devices]]></category>
		<category><![CDATA[Edge Geräte]]></category>
		<category><![CDATA[fog-computing]]></category>
		<category><![CDATA[ki]]></category>
		<category><![CDATA[KI-Applikationen]]></category>
		<category><![CDATA[KI-Edge Geräte]]></category>
		<category><![CDATA[künstliche Intelligenz]]></category>
		<category><![CDATA[Mustererkennung]]></category>
		<category><![CDATA[Pattern recognition]]></category>
		<guid isPermaLink="false">https://future-markets-magazine.com/?p=9098</guid>

					<description><![CDATA[<p>Nowadays, edge devices can run AI applications thanks to increasingly powerful hardware. As such, there&#8230;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/edge-computing-helps-ai-take-off/">Edge Computing helps AI take off</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></description>
					<content:encoded><![CDATA[<p><strong>Nowadays, edge devices can run AI applications thanks to increasingly powerful hardware. As a result, there is no longer any need to transfer data to the <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a>. This makes real-time AI applications a reality, reduces costs and brings advantages in terms of <a href="https://future-markets-magazine.com/en/encyclopedia/data-security/" target="_blank" title="Laws and technical measures aimed at preventing the unauthorised storage, processing and distribution of sensitive&hellip;" class="encyclopedia">data security</a>.</strong></p>
<p>Today, Artificial Intelligence (AI) technology is used in a wide range of applications, where it opens up new possibilities for creating added value. AI is used to predictively analyse the behaviour of users on social networks, enabling ads matching their needs or interests to be shown. Even facial- and voice-recognition features on smartphones would not work without artificial intelligence. In industrial applications, AI helps make maintenance more effective by predicting machine failures before they occur. According to a white paper by investment bank Bryan, Garnier &amp; Co., 99 per cent of AI-related semiconductor hardware was still centralised in the <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> as recently as 2017.</p>
<h2><strong>The difference between training and inference</strong></h2>
<p>One of Artificial Intelligence&rsquo;s most important functions is <a href="https://future-markets-magazine.com/en/encyclopedia/machine-learning/" target="_blank" title="Procedure by which computer systems acquire knowledge independently and can expand their knowledge, allowing them&hellip;" class="encyclopedia">Machine Learning</a>, with which IT systems can use existing data pools to detect <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s, patterns and rules, and come up with solutions. <a href="https://future-markets-magazine.com/en/encyclopedia/machine-learning/" target="_blank" title="Procedure by which computer systems acquire knowledge independently and can expand their knowledge, allowing them&hellip;" class="encyclopedia">Machine Learning</a> is a two-stage process. In the training phase, the system is initially &ldquo;taught&rdquo; to identify patterns in a large data set; this is a long-running task that requires a large amount of computing power. After this phase, the Machine-Learning system can apply the final, trained model to analyse and categorise new data and derive a result. This step &ndash; known as <a href="https://future-markets-magazine.com/en/encyclopedia/inference/" target="_blank" title="Phase of application of artificial intelligence. After the system has been trained, it calls on&hellip;" class="encyclopedia">inference</a> &ndash; requires much less computing power.</p>
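The two phases can be illustrated with a deliberately tiny model in plain Python (a sketch, not any specific AI framework): an expensive training step fits the model once, and a cheap inference step merely applies it to new inputs, as an edge device would.

```python
# Minimal illustration of the two Machine-Learning phases:
# an expensive training step that fits a model to a data set, and a
# cheap inference step that merely applies the trained model.

def train(samples):
    """Training phase: fit y = a*x + b by least squares over the data set.
    For real models this is the long-running, compute-hungry step."""
    n = len(samples)
    sx = sum(x for x, _ in samples)
    sy = sum(y for _, y in samples)
    sxx = sum(x * x for x, _ in samples)
    sxy = sum(x * y for x, y in samples)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def infer(model, x):
    """Inference phase: apply the trained model to a new input.
    Cheap enough to run on an edge device."""
    a, b = model
    return a * x + b

model = train([(0, 1), (1, 3), (2, 5), (3, 7)])  # learns y = 2x + 1
print(infer(model, 10))  # applies the model to unseen data
```

In the edge-computing scenario described here, `train` would run once in a data centre, while the small, fixed `model` is shipped to the device, where only `infer` executes.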
<h2><strong>Cloud infrastructure cannot handle AI requirements alone</strong></h2>
<p>Most <a href="https://future-markets-magazine.com/en/encyclopedia/inference/" target="_blank" title="Phase of application of artificial intelligence. After the system has been trained, it calls on&hellip;" class="encyclopedia">inference</a> and training steps today are executed in the <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a>. In the case of a virtual assistant, for example, the user&rsquo;s command is sent to a data centre, analysed there with the appropriate <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s and sent back to the device with the appropriate response. Until now, the <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> has remained the most efficient way to exploit the benefits of powerful, cutting-edge hardware and software. Yet the increasing number of AI applications is threatening to overload current <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> infrastructure.</p>
<p>For instance, if every Android device in the world were to execute a voice-recognition command every three minutes, Google would need to make twice as much computing power available as it currently does. &ldquo;In other words, the world&rsquo;s largest computing infrastructure would have to double in size,&rdquo; explains Jem Davies, Vice President, Fellow and General Manager of the ARM <a href="https://future-markets-magazine.com/en/encyclopedia/machine-learning/" target="_blank" title="Procedure by which computer systems acquire knowledge independently and can expand their knowledge, allowing them&hellip;" class="encyclopedia">Machine Learning</a> Group. &ldquo;Also, demands for seamless user experiences mean people won&rsquo;t accept the latency inherent in performing ML processing in the <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a>.&rdquo;</p>
<h2><strong>The number of AI edge devices is set to boom</strong></h2>
<p><a href="https://future-markets-magazine.com/en/encyclopedia/inference/" target="_blank" title="Phase of application of artificial intelligence. After the system has been trained, it calls on&hellip;" class="encyclopedia">Inference</a> tasks are increasingly being relocated to the edge as a result, enabling the data to be processed there without needing to be transferred anywhere else. &ldquo;The act of transferring data is inherently costly. In business-critical use cases where latency and accuracy are key and constant connectivity is lacking, applications can&rsquo;t be fulfilled. Locating AI <a href="https://future-markets-magazine.com/en/encyclopedia/inference/" target="_blank" title="Phase of application of artificial intelligence. After the system has been trained, it calls on&hellip;" class="encyclopedia">inference</a> processing at the edge also means that companies don&rsquo;t have to share private or sensitive data with <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> providers. Something that is problematic in the healthcare and consumer sectors,&rdquo; explains Jack Vernon, Industry Analyst at ABI Research.</p>
<p>According to market researchers at Tractica, this means that annual deliveries of edge devices with integrated AI will increase from 161.4 million in 2018 to 2.6 billion in 2025. Smartphones will account for the largest share of these unit quantities, followed by smart speakers and laptops.</p>
<h2><strong>Smartphones making strides</strong></h2>
<p>Smartphones are a good example of the sheer variety of possible applications for AI in edge devices. To name one example, AI enables the camera of the Huawei P smart 2019 to detect 22 different subject types and 500 scenarios in order to optimise settings and take the perfect photo. In the Samsung Galaxy S10 5G, on the other hand, AI automatically adapts the battery, processor performance, memory usage and device temperature to the user&rsquo;s behaviour.</p>
<p>Gartner also names a few other potential applications, including user <a href="https://future-markets-magazine.com/en/encyclopedia/authentication/" target="_blank" title="Ensures that the communication partner at the other end is authentic." class="encyclopedia">authentication</a>. In the future, smartphones might be able to record and learn a user&rsquo;s behaviour, such as patterns when walking or when scrolling and typing on the touch screen &ndash; all without any passwords or active <a href="https://future-markets-magazine.com/en/encyclopedia/authentication/" target="_blank" title="Ensures that the communication partner at the other end is authentic." class="encyclopedia">authentication</a> being required.</p>
<p>Emotion sensors and affective computing open up further possibilities for smartphones: detecting, analysing and processing people&rsquo;s emotional states and moods before reacting to them.</p>
<div class="su-note" style="border-color:#007072;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><div class="su-note-inner su-clearfix" style="background-color:#007D7F;border-color:#cce5e5;color:#ffffff;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;">With affective computing, human emotions will be interpreted and simulated. This is an interdisciplinary approach incorporating IT, psychology and cognitive science. And concerned with the interaction between humans and machines or computers, respectively.</div></div>
<p>For instance, vehicle manufacturers might use the front camera of a smartphone to interpret a driver&rsquo;s mental state or assess signs of fatigue in order to improve safety. Voice control or <a href="https://future-markets-magazine.com/en/encyclopedia/augmented-reality/" target="_blank" title="A combination of the perceived real world and virtual reality generated by computer. Users are&hellip;" class="encyclopedia">augmented reality</a> are other potential applications for AI in smartphones.</p>
<p>CK Lu, Research Director at Gartner is certain of one thing: &ldquo;Future AI capabilities will allow smartphones to learn, plan and solve problems for users. This isn&rsquo;t just about making the smartphone smarter, but augmenting people by reducing their cognitive load. However, AI capabilities on smartphones are still in very early stages.&rdquo;</p>
<p>&nbsp;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/edge-computing-helps-ai-take-off/">Edge Computing helps AI take off</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>ACAPs – Adaptive Compute Acceleration Platforms</title>
		<link>https://future-markets-magazine.com/en/markets-technology-en/acaps-adaptive-compute-acceleration-platforms/</link>
		
		<dc:creator><![CDATA[The Quintessence]]></dc:creator>
		<pubDate>Wed, 30 Dec 2020 07:00:03 +0000</pubDate>
				<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[Markets & Technology]]></category>
		<category><![CDATA[ACAP]]></category>
		<category><![CDATA[Adaptive Compute Acceleration Platform]]></category>
		<category><![CDATA[Application Specific Integrated Circuit]]></category>
		<category><![CDATA[ASIC]]></category>
		<category><![CDATA[chip generation]]></category>
		<category><![CDATA[edge computing]]></category>
		<category><![CDATA[Embedded Systeme]]></category>
		<category><![CDATA[Embedded Systems]]></category>
		<category><![CDATA[Field Programmable Gate Array]]></category>
		<category><![CDATA[fog-computing]]></category>
		<category><![CDATA[FPGA]]></category>
		<category><![CDATA[Gflops]]></category>
		<category><![CDATA[Moore'sches Gesetz]]></category>
		<category><![CDATA[Moores Law]]></category>
		<guid isPermaLink="false">https://future-markets-magazine.com/?p=9083</guid>

					<description><![CDATA[<p>ACAPs are highly-integrated, heterogeneous multi-core computing platforms that significantly extend the capabilities of FPGAs and&#8230;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/acaps-adaptive-compute-acceleration-platforms/">ACAPs – Adaptive Compute Acceleration Platforms</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>ACAPs are highly integrated, heterogeneous multi-core computing platforms that significantly extend the capabilities of <a href="https://future-markets-magazine.com/en/encyclopedia/fpga/" target="_blank" title="Field Programmable Gate Array" class="encyclopedia">FPGA</a>s and operate more efficiently than conventional processor platforms. They are the next generation of even more powerful platforms, and they are already entering the market. Today&rsquo;s microprocessors are so powerful and inexpensive that almost any device can be equipped with them. At present, the constant, ongoing development of these components is even enabling edge devices to be kitted out with AI. </strong></p>
<p><a href="https://future-markets-magazine.com/en/encyclopedia/embedded-system/" target="_blank" title="Hardware and software components integrated into a unified system to implement system-specific functional features." class="encyclopedia">Embedded system</a>s and Edge Computing are only feasible for one reason: microprocessors with ever greater computing power are available at ever smaller price tags.</p>
<p>Moore&rsquo;s law has held true for over 50 years now. It describes how the performance of computer and memory chips doubles roughly every 12 to 24 months. Accordingly, the first 4-bit processors were launched onto the market in the early 1970s, with 2,250 transistors &ldquo;on board&rdquo; and a clock frequency of 740 kilohertz. At regular intervals since then, processors have been launched that double the quantity of information processed in the same clock period: first 8-bit, then 16-bit and then 32-bit.</p>
<p>At the turn of the millennium, 64-bit architecture also became available for computers. The most powerful processors currently available feature around 20 billion transistors and clock frequencies of more than 4.5 gigahertz. This development was accompanied by a drastic reduction in cost: in 1961, a GFlop &ndash; one billion computing operations per second &ndash; cost an astonishing USD 145.5 billion (adjusted for inflation), whereas today the price is a matter of mere cents. This means that even inexpensive, mass-produced products can easily be equipped with chips today, and that sufficient computing power is available for Edge Computing.</p>
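A back-of-the-envelope check shows just how dramatic that price drop is. The present-day figure of a few cents per GFlop is an assumption for illustration; only the 1961 figure comes from the text:

```python
# Back-of-the-envelope: how far the price of one GFlop
# (one billion computing operations per second) has fallen.
# price_today is an assumed "few cents" figure for illustration.

price_1961 = 145.5e9   # USD per GFlop in 1961, adjusted for inflation
price_today = 0.03     # USD per GFlop today (assumption)

factor = price_1961 / price_today
print(f"Cost per GFlop fell by a factor of about {factor:.1e}")
```

Even with a generous estimate for today's price, the cost has fallen by roughly twelve to thirteen orders of magnitude, which is why chips can now go into mass-produced edge devices.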
<h2><strong>Lower energy consumption with ever higher performance</strong></h2>
<p>Nonetheless, Edge Computing is not just concerned with chip performance, but also with energy efficiency. Needless to say, many smart devices are battery-powered. ASICs (application-specific integrated circuits) offer the highest efficiency; however, they cannot be re-configured if and when requirements change.</p>
<p>That is why so-called <a href="https://future-markets-magazine.com/en/encyclopedia/fpga/" target="_blank" title="Field Programmable Gate Array" class="encyclopedia">FPGA</a>s are increasingly replacing multi-purpose processors for Edge Computing. These field-programmable gate arrays are integrated circuits whose logic can be reconfigured after production. Unlike processors, <a href="https://future-markets-magazine.com/en/encyclopedia/fpga/" target="_blank" title="Field Programmable Gate Array" class="encyclopedia">FPGA</a>s are capable of parallel data processing with their multiple, programmable logic blocks: every single processing task is assigned to a dedicated area on the chip, where it can be executed autonomously. In doing so, they consume considerably less power than <a href="https://future-markets-magazine.com/en/encyclopedia/cpu/" target="_blank" title="Central Processing Unit" class="encyclopedia">CPU</a>s. As such, <a href="https://future-markets-magazine.com/en/encyclopedia/fpga/" target="_blank" title="Field Programmable Gate Array" class="encyclopedia">FPGA</a>s combine the flexibility and programmability of software running on a multi-purpose processor with the speed and energy efficiency of an ASIC.</p>
<p>Their ability to work through many tasks at once makes <a href="https://future-markets-magazine.com/en/encyclopedia/fpga/" target="_blank" title="Field Programmable Gate Array" class="encyclopedia">FPGA</a>s seem predestined for AI applications. Voice control, image processing and <a href="https://future-markets-magazine.com/en/encyclopedia/augmented-reality/" target="_blank" title="A combination of the perceived real world and virtual reality generated by computer. Users are&hellip;" class="encyclopedia">augmented reality</a> are just a few examples of today&rsquo;s AI applications. They all require high computing power and low electricity consumption, not to mention low latency to make the experience seem responsive and natural.</p>
<p>That is why the trend is toward moving a growing number of AI applications from the <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> to Edge Computing. GPUs, which are frequently used in data centres thanks to their parallel-data-processing abilities, require too much power for Edge Computing. Analysts from McKinsey anticipate that the market for AI hardware in edge applications will grow from around USD 100 million in 2017 to USD 5.5 billion in 2025. At the same time, the major chip manufacturers who currently dominate the market for <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> solutions are increasingly facing stiff competition from other established companies.</p>
<h2>ACAPs as further development of FPGAs</h2>
<p>ACAPs are on the horizon as potential successors to <a href="https://future-markets-magazine.com/en/encyclopedia/fpga/" target="_blank" title="Field Programmable Gate Array" class="encyclopedia">FPGA</a>s. These adaptive compute acceleration platforms are highly integrated, heterogeneous, multi-core computing platforms that considerably expand the possibilities of <a href="https://future-markets-magazine.com/en/encyclopedia/fpga/" target="_blank" title="Field Programmable Gate Array" class="encyclopedia">FPGA</a>s, and they are substantially faster and more energy-efficient than <a href="https://future-markets-magazine.com/en/encyclopedia/cpu/" target="_blank" title="Central Processing Unit" class="encyclopedia">CPU</a>- and GPU-based platforms. ACAPs can be modified at a hardware level and adapted to a broad spectrum of applications and computational loads, including dynamically during operation.</p>
<p>While an <a href="https://future-markets-magazine.com/en/encyclopedia/fpga/" target="_blank" title="Field Programmable Gate Array" class="encyclopedia">FPGA</a> can only be configured with a single logic circuit at a time, an ACAP is suitable for accelerating a wide range of applications, including those from the domain of AI. The first ACAP chips from Xilinx were shipped in summer 2019. They are based on 7-nanometre process technology and boast over 50 billion transistors.</p>
<p>&nbsp;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/acaps-adaptive-compute-acceleration-platforms/">ACAPs – Adaptive Compute Acceleration Platforms</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>High-speed communication with 5G and Ethernet</title>
		<link>https://future-markets-magazine.com/en/markets-technology-en/high-speed-communication-with-5g-and-ethernet/</link>
		
		<dc:creator><![CDATA[The Quintessence]]></dc:creator>
		<pubDate>Tue, 29 Dec 2020 17:00:30 +0000</pubDate>
				<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[Markets & Technology]]></category>
		<category><![CDATA[Passion for Technology]]></category>
		<category><![CDATA[5g]]></category>
		<category><![CDATA[Bussystem]]></category>
		<category><![CDATA[cloud]]></category>
		<category><![CDATA[Datenübertragungsrate]]></category>
		<category><![CDATA[Echtzeit-Ethernet]]></category>
		<category><![CDATA[Edge Knoten]]></category>
		<category><![CDATA[edge node]]></category>
		<category><![CDATA[Ethernet]]></category>
		<category><![CDATA[Ethernet-Technologie]]></category>
		<category><![CDATA[fog-computing]]></category>
		<category><![CDATA[Halbleitertechnologie]]></category>
		<category><![CDATA[highspeed communication]]></category>
		<category><![CDATA[Highspeed-Kommunikation]]></category>
		<category><![CDATA[IOT]]></category>
		<category><![CDATA[Mobile Edge Computing]]></category>
		<category><![CDATA[mobile industry]]></category>
		<category><![CDATA[Mobilfunkindustrie]]></category>
		<category><![CDATA[Mobilfunkstandard]]></category>
		<category><![CDATA[Netzwerk-Architektur]]></category>
		<category><![CDATA[real time Ethernet]]></category>
		<category><![CDATA[semiconductor technology]]></category>
		<category><![CDATA[wireless industry]]></category>
		<guid isPermaLink="false">https://future-markets-magazine.com/?p=9107</guid>

					<description><![CDATA[<p>High-speed communication and Edge Computing are closely intertwined – you can’t have one without the&#8230;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/high-speed-communication-with-5g-and-ethernet/">High-speed communication with 5G and Ethernet</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>High-speed communication and Edge Computing are closely intertwined &ndash; you can&rsquo;t have one without the other. Ethernet ensures the fastest possible transfer of device data to the edge node. While Edge Computing enables the performance goals of the 5G telecommunications standard to be achieved in the first place.</strong></p>
<p>Thanks to the rapid development of semiconductor technology, it might be theoretically possible to equip every networked device with its own intelligence and sufficient processing capacity &ndash; but it still isn&rsquo;t worthwhile. Firstly, the cost of doing so would be prohibitive. Secondly, many applications require data from more than just one source. One example is predictive maintenance in an industrial setting: an analytical program can only predict a component failure if it is supplied with information from as many sensors as possible, installed at various points on the machine.</p>
<p>This is why the data are collected and processed in a node &ndash; a mini data centre. A network architecture of this kind is frequently referred to as Fog Computing, while others still consider it to be Edge Computing. Regardless, many of the applications that run here require high-performance communication systems with which the data from the sensors, <a href="https://future-markets-magazine.com/en/encyclopedia/actuator/" target="_blank" title="A component which converts electronic signals into mechanical motion or other physical quantities, such as&hellip;" class="encyclopedia">actuator</a>s or devices can be transferred to the mini data centre and back again in real time.</p>
<h2><strong>High-speed communication with Ethernet</strong></h2>
<p>Over the past few years, one communication system has been able to establish itself in many areas, whether in factories, cars or buildings: Ethernet. Compared with traditional <a href="https://future-markets-magazine.com/en/encyclopedia/bus-system/" target="_blank" title="In a bus system, numerous end devices share a single data line (bus line). Since&hellip;" class="encyclopedia">bus system</a>s, Ethernet boasts higher speeds, improved handling of large data volumes and lower costs, thanks to the technology&rsquo;s high energy efficiency and its flexible and economical bandwidth options. Nowadays, Ethernet protocols enable high-speed communication at up to 400 Gbit/s through the use of optical-fibre cables. For industrial applications, real-time Ethernet and industrial Ethernet variants have been developed that operate with the lowest possible latency.</p>
<p>In the words of John D&rsquo;Ambrosia, Chairman of the industrial consortium Ethernet Alliance: &ldquo;Ethernet encompasses a vibrant ecosystem of technologies integrating seamlessly to deliver the robust connectivity demanded by emerging markets and applications. And even as networks grow larger and faster, Ethernet is successfully meeting the multi-vendor challenges of these networks head-on. Ethernet has not only horsepower needed to support next-generation networks, but is mature enough to do so reliably and cost-effectively.&rdquo; Accordingly, the consortium assumes that high-speed communication at up to 800 Gbit/s, or even 1.6 Tbit/s, will be possible in a few years.</p>
<h2><strong>A mobile-communications standard for the IoT</strong></h2>
<p>The devices and nodes networked via Ethernet will be connected to the <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> using <a href="https://future-markets-magazine.com/en/encyclopedia/broadband/" target="_blank" title="Umbrella term for Internet access via networks operating at high data transfer rates, implemented as&hellip;" class="encyclopedia">broadband</a> cables or &ndash; in the case of mobile applications &ndash; via cellular infrastructure. The <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> will provide highly flexible, scalable resources such as storage or (non-time-critical) computing power in central data centres. However, the sheer number of future networked devices is already pushing today&rsquo;s data transfer systems to their limits, at least as far as the wireless technologies required for mobile applications are concerned. After all, even the fastest mobile-communications network available today &ndash; <a href="https://future-markets-magazine.com/en/encyclopedia/lte/" target="_blank" title="Long Term Evolution" class="encyclopedia">LTE</a> or 4G &ndash; can only support 2,000 active devices within an area of one square kilometre. The continuous growth in the number of networked devices in the <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> will necessitate considerably more active connections.</p>
<p>The solution lies in the 5G standard, which can support up to 100,000 active devices per square kilometre, with that figure even rising to 1 million in the future. Yet even with a data transfer rate of up to 10 Gbit/s, 5G can only work in combination with Edge Computing. Firstly, with the gigantic number of networked devices anticipated in the future, the mobile-communications network would simply be overloaded if all the data involved were sent to the <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> to be processed. Secondly, applications such as autonomous driving or <a href="https://future-markets-magazine.com/en/encyclopedia/industry-4-0/" target="_blank" title="also known as Smart Manufacturing" class="encyclopedia">Industry 4.0</a> require latency times of less than 10 milliseconds, with some even as little as 1 millisecond. In order to achieve these response times, the transfer distances for 5G cannot be too long either.</p>
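<p>To get a feel for why the transfer distances must be short, the following back-of-the-envelope sketch (our illustration, not part of the 5G specification) computes the furthest away a processing node could be if the entire latency budget were spent purely on signal propagation in optical fibre:</p>

```python
# Rough latency-budget sketch: how far away may the processing node be?
# Assumes signals propagate at roughly 2/3 of the speed of light (typical
# for fibre) and ignores all processing and queueing delays, so real-world
# distances are considerably shorter still.

SPEED_OF_LIGHT_KM_PER_MS = 299.792  # km travelled per millisecond in vacuum
FIBRE_FACTOR = 2 / 3                # propagation speed in optical fibre

def max_round_trip_distance_km(latency_budget_ms: float) -> float:
    """Maximum distance to the server if the whole budget were spent on
    signal propagation there and back."""
    one_way_ms = latency_budget_ms / 2
    return one_way_ms * SPEED_OF_LIGHT_KM_PER_MS * FIBRE_FACTOR

for budget in (10, 1):
    print(f"{budget} ms budget -> at most {max_round_trip_distance_km(budget):.0f} km")
```

<p>Even under these idealised assumptions, a 1-millisecond budget caps the distance at around 100 kilometres &ndash; which is why the processing has to move to the edge.</p>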
<h2><strong>A mini data centre in every cell tower</strong></h2>
<p>That is why 5G providers position data centres at the edge of their mobile-communications networks; more or less right at the 5G antenna itself. This enables them to host or provide a range of applications and services that benefit from low latency. At the same time, they can reduce the volume of data traffic that needs to be fed back to the core network. This reduces the cost of the data transfer &ndash; in short, edge computing saves mobile-communications providers money. It also leads to an improvement in the overall mobile user experience for consumers: the low latency results in a palpable acceleration of the response within mobile-communications networks. For instance, if TV shows and films were to be temporarily buffered at the edge, video <a href="https://future-markets-magazine.com/en/encyclopedia/streaming/" target="_blank" title="The continuous transfer of compressed data in the form of a data stream over the&hellip;" class="encyclopedia">streaming</a> on mobile devices would begin almost immediately.</p>
<p>To put it simply, 5G would not be able to achieve the targets of very low latency and massive <a href="https://future-markets-magazine.com/en/encyclopedia/broadband/" target="_blank" title="Umbrella term for Internet access via networks operating at high data transfer rates, implemented as&hellip;" class="encyclopedia">broadband</a> without edge computing. &ldquo;Edge computing only emerged on the wireless-industry stage several years ago, yet its significance is already great,&rdquo; says Iain Gillott, President and founder of specialist consulting company iGR. &ldquo;We believe that edge computing will ultimately be essential in realising the full promise of 5G.&rdquo;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/high-speed-communication-with-5g-and-ethernet/">High-speed communication with 5G and Ethernet</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Data storage in the future</title>
		<link>https://future-markets-magazine.com/en/markets-technology-en/data-storage-in-the-future/</link>
		
		<dc:creator><![CDATA[The Quintessence]]></dc:creator>
		<pubDate>Tue, 29 Dec 2020 07:00:52 +0000</pubDate>
				<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[Markets & Technology]]></category>
		<category><![CDATA[3D NAND Hersteller]]></category>
		<category><![CDATA[3D NAND manufacturers]]></category>
		<category><![CDATA[3D NAND memories]]></category>
		<category><![CDATA[3D NAND Speicher]]></category>
		<category><![CDATA[data storage]]></category>
		<category><![CDATA[data-storage technology]]></category>
		<category><![CDATA[Datenspeicherung]]></category>
		<category><![CDATA[flash memories]]></category>
		<category><![CDATA[magnetic memories]]></category>
		<category><![CDATA[magnetische Speicher]]></category>
		<category><![CDATA[memory chips]]></category>
		<category><![CDATA[NAND flash]]></category>
		<category><![CDATA[random spin]]></category>
		<category><![CDATA[Speicherbausteine]]></category>
		<category><![CDATA[spin]]></category>
		<category><![CDATA[USB flash drives]]></category>
		<guid isPermaLink="false">https://future-markets-magazine.com/?p=9121</guid>

					<description><![CDATA[<p>Data storage on edge devices requires robust and increasingly powerful memory chips. Flash has been&#8230;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/data-storage-in-the-future/">Data storage in the future</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Data storage on edge devices requires robust and increasingly powerful memory chips. Flash has been a proven solution for many years, but new technologies and materials offer the promise of even faster and more durable memories.</strong></p>
<p>Suitable data memories are also needed for processing data on edge devices. However, the memory chips must overcome particular challenges when it comes to storing data on these devices.</p>
<p>First of all, they need to offer a combination of low cost and high performance. In addition, the storage media should be energy-efficient so that they can also be used in battery-powered devices. Edge devices must be able to withstand lower and higher temperatures than typical PC applications, and they have to be impervious to vibrations and shocks when used in mobile devices. Classic hard disk drives (HDDs) are too sensitive in this regard &ndash; and also too large for most edge applications.</p>
<h2><strong>Flash is proven but pushed to its limits</strong></h2>
<p>Flash memories, as familiar from USB flash drives for example, have already proven their worth. These are semiconductor memories in which information (bits) is stored in a memory cell in the form of electrical charges. Whilst flash memories are still somewhat more expensive per gigabyte, they do offer considerably faster data processing than HDDs: the read/write speeds that can be achieved with a flash memory are four times higher than those of HDDs. Moreover, they offer a higher density, saving space and weight, and are less susceptible to errors owing to the lack of moving parts.</p>
<p>The development of flash memories has focused in recent years on reducing the size of the cells and thereby increasing the data density. However, a physical limit has already been reached here with the present-day nanometre structures, and the lifetime and reliability of the memories are suffering as a result. 3D-NAND memories represent one solution. These are memories where &ndash; in simple terms &ndash; the planar memory cells are stacked vertically in multiple layers. Because of the shorter connections between the memory cells, the storage capacity and storage speed can be increased, and the power consumption reduced, too.</p>
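<p>The density gain from stacking can be illustrated with some toy arithmetic. The planar cell count below is invented for illustration; only the &ldquo;three bits per cell&rdquo; factor reflects real triple-level-cell flash:</p>

```python
# Toy calculation (invented numbers): vertical stacking multiplies the areal
# bit density without shrinking the planar cell any further, and storing
# several bits per cell adds another multiplicative factor.

def areal_density(planar_cells_per_mm2: float, layers: int, bits_per_cell: int) -> float:
    """Bits per mm^2 for a stacked, multi-level-cell memory."""
    return planar_cells_per_mm2 * layers * bits_per_cell

planar = 1.0e6  # illustrative planar cell count per mm^2
print(areal_density(planar, layers=1, bits_per_cell=1))    # planar baseline
print(areal_density(planar, layers=96, bits_per_cell=3))   # 96 layers, 3 bits/cell
```

<p>With 96 layers and three bits per cell, the same silicon area holds 288 times as many bits as the single-layer, single-bit baseline &ndash; without any further shrinking of the cell structures.</p>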
<p>&ldquo;To respond to the impressive demand for higher storage capacity and reliability while lowering cost per bit, 3D-NAND manufacturers implement innovative techniques&rdquo;, explains Belinda Dube, Cost Analyst at System Plus Consulting. &ldquo;They change the storage type, the memory cell design and stack more layers with each generation to increase bit density and hence reduce the die size. The technological changes in the cell architecture and modification of the fundamental memory functions add complexity to the manufacturing process. However these techniques do lower the cost per gigabyte.&rdquo;</p>
<h2>Data storage in the future</h2>
<p>However, even the current 3D-NAND memories are increasingly reaching their limits as applications become more complex, the flood of data continues to grow and demands intensify rapidly.</p>
<p>&ldquo;Data-storage technology has reached a scaling limit. We need new concepts in order to store the data volumes&nbsp;we will produce in the future&rdquo;, explains Peter Zalden, Scientist at the European XFEL. XFEL operates the largest x-ray laser in the world. Together with researchers from the University of Duisburg-Essen, Zalden used this laser to investigate how data storage could become better and more efficient with new phase-change materials.</p>
<p>Phase-change memories store data by changing the physical state of the storage material: an electromagnetic field, heat or light pulses briefly melt the material, which then solidifies in either a vitreous (amorphous) or a crystalline state. These two states correspond to the 0 and 1 in binary code. Corresponding memories have the potential to be a thousand times faster and significantly more durable than existing flash memory chips and could allow future generations of smartphones to offer a higher storage density and greater energy efficiency.</p>
<p>Another promising technology is based on magnetic memories. These exploit spin, the intrinsic angular momentum of a particle. &ldquo;The spin is closely related to magnetism. It can be impacted by magnetic fields&rdquo;, says Dr. Viktor Sverdlov from the Institute for Microelectronics at the Vienna University of Technology. &ldquo;In the same way that information can be stored by applying different electrical charge at specific points, information can also be stored by ensuring different spin at specific points.&rdquo;</p>
<p>In contrast to flash memories, such memories can be written any number of times and allow very short write and read access times. However, these new storage technologies are still too expensive to be used on a mass scale in edge devices.</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/data-storage-in-the-future/">Data storage in the future</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Embedded Software and its colourful world</title>
		<link>https://future-markets-magazine.com/en/markets-technology-en/embedded-software-and-its-colourful-world/</link>
		
		<dc:creator><![CDATA[The Quintessence]]></dc:creator>
		<pubDate>Sun, 27 Dec 2020 17:00:58 +0000</pubDate>
				<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[Markets & Technology]]></category>
		<category><![CDATA[Alibaba Cloud]]></category>
		<category><![CDATA[application software]]></category>
		<category><![CDATA[Applikationssoftware]]></category>
		<category><![CDATA[AWS Greengrass]]></category>
		<category><![CDATA[Betriebssystem]]></category>
		<category><![CDATA[edge device]]></category>
		<category><![CDATA[Edge Gerät]]></category>
		<category><![CDATA[embedded device]]></category>
		<category><![CDATA[Embedded-Gerät]]></category>
		<category><![CDATA[IoT device]]></category>
		<category><![CDATA[IoT Software]]></category>
		<category><![CDATA[IoT-Gerät]]></category>
		<category><![CDATA[Linux]]></category>
		<category><![CDATA[Microsoft Azure iOT Edge]]></category>
		<category><![CDATA[operating system]]></category>
		<category><![CDATA[PREEMPT_RT]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[Ubuntu]]></category>
		<category><![CDATA[Windows 10 IoT]]></category>
		<guid isPermaLink="false">https://future-markets-magazine.com/?p=9131</guid>

					<description><![CDATA[<p>Embedded Software and its colourful world. Like any other computer, embedded and edge devices also&#8230;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/embedded-software-and-its-colourful-world/">Embedded Software and its colourful world</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Embedded Software and its colourful world. Like any other computer, embedded and edge devices need software to perform their tasks. Whilst a variety of different operating systems and application programs exist today, the prevailing trend is towards open systems.</strong></p>
<p>Embedded and edge devices are essentially small computers, and like any computer they need software &ndash; embedded software &ndash; to perform their tasks. The starting point, just as with a PC, is an operating system. However, because an embedded computer performs precisely defined tasks, its workload is nowhere near as high as that of a PC.</p>
<p>This means that very simple <a href="https://future-markets-magazine.com/en/encyclopedia/embedded-system/" target="_blank" title="Hardware and software components integrated into a unified system to implement system-specific functional features." class="encyclopedia">embedded system</a>s can even manage without an operating system. An embedded computer does need an operating system, though, when multiple applications are added &ndash; as is the case with Edge Computing. The operating system is responsible for memory and file management, for ensuring that various tasks can be processed in parallel, and for handling integration in higher-level systems based on various communication standards. Special embedded versions of standard operating systems are frequently used here, for example Windows 10 <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> or Linux-based systems such as Ubuntu.</p>
<h2><strong>Linux with real-time capability</strong></h2>
<p>Linux has established itself as one of the most powerful embedded operating systems, thanks to the large number of <a href="https://future-markets-magazine.com/en/encyclopedia/cpu/" target="_blank" title="Central Processing Unit" class="encyclopedia">CPU</a> architectures supported, the practically unlimited number of drivers and its excellent portability and scalability. According to the <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> Developer Survey 2018 conducted by the Eclipse Foundation, around 72 per cent of developers opt for Linux systems for their <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> devices, whilst only approx. 23 per cent use Windows.</p>
<div class="su-note" style="border-color:#007072;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><div class="su-note-inner su-clearfix" style="background-color:#007D7F;border-color:#cce5e5;color:#ffffff;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;">Debian is the leading Linux system. Around one third of all Linux users who took part in the Eclipse survey use Debian and Debian derivatives such as Raspbian, Ubuntu or Ubuntu Core.</div></div>
<p>Meanwhile, Linux-based systems that fulfil the requirements for hard real-time use are starting to emerge. Until now, special real-time extensions, such as RT patches or PREEMPT_RT, had to be used for this purpose. Many of the techniques developed in this context are already included in the Linux kernel, that is, in the basic system. However, the expectation is that the real-time extensions will be incorporated fully into the kernel in the foreseeable future.</p>
<p>The current Linux version 5.3 can already be viewed as a trailblazer in terms of real-time support, because a new kernel configuration option has been added to the main development branch of the kernel that will allow a fully preemptible kernel (real-time/RT) to be built &ndash; with support for real-time requirements.</p>
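<p>The preemption model of a given kernel can be read from its build configuration. As a hedged illustration, the sketch below scans a kernel configuration fragment for the enabled CONFIG_PREEMPT* option (CONFIG_PREEMPT_RT being the option for the fully preemptible real-time kernel; the sample configuration text is invented):</p>

```python
# Minimal sketch: inspect the text of a kernel .config (or the contents of
# /proc/config.gz) to see which preemption model is enabled.
# CONFIG_PREEMPT_RT selects the fully preemptible (real-time) kernel.

def preemption_model(config_text: str) -> str:
    """Return the first enabled CONFIG_PREEMPT* option, or 'unknown'."""
    for line in config_text.splitlines():
        line = line.strip()
        # disabled options appear as comments: "# CONFIG_... is not set"
        if line.startswith("CONFIG_PREEMPT") and line.endswith("=y"):
            return line.split("=", 1)[0]
    return "unknown"

sample_config = """
# CONFIG_PREEMPT_NONE is not set
# CONFIG_PREEMPT_VOLUNTARY is not set
CONFIG_PREEMPT_RT=y
"""
print(preemption_model(sample_config))  # CONFIG_PREEMPT_RT
```

<p>On a real system, the same check can be run against the running kernel's exported configuration rather than a sample string.</p>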
<h2><strong>Faster to market with edge frameworks</strong></h2>
<p>The application software runs on the operating system. It contains the device-specific functions, processes the recorded data and represents the interface to the higher-level system. Programming an application from the ground up is complex. However, since certain functions and elements are the same in every piece of application software, so-called frameworks were developed. These are not independent programs, but rather provide a structure for programmers in the sense of a programming framework.</p>
<p>Since these functions and elements do not have to be reprogrammed every time, the development time is reduced significantly. Examples of frameworks that have become established in Edge Computing include AWS Greengrass, Microsoft Azure <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> Edge or Alibaba <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">Cloud</a>. Choosing the right framework is not a simple task, however, since a framework generally requires specific hardware (e.g. an x86 architecture or AMD chips), only runs on a defined operating system and is tailored to special fields of application.</p>
<h2><strong>Open systems unify the market</strong></h2>
<p>Since Edge Computing can at times involve an entire eco-system of different devices from different manufacturers communicating with one another, the aim should be to avoid interfaces between various operating systems and application programs where possible. This is what open systems promise. Open in this respect does not (only) mean that the software is developed as an open-source project &ndash; this is already the case with Azure <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> Edge, for example. In essence, open means that the application software is independent of the hardware and operating system, and additionally of the software used in the <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a>.</p>
<p>This is precisely what is being aimed for with LF Edge, an initiative within the Linux Foundation to create an open and interoperable framework for Edge Computing that is independent of hardware, silicon, <a href="https://future-markets-magazine.com/en/encyclopedia/cloud/" target="_blank" title="Provision of IT resources over the Internet on demand, billed according to actual usage." class="encyclopedia">cloud</a> or operating system.</p>
<p>LF Edge was launched at the beginning of 2019 and initially consists of five projects &ndash; Akraino Edge Stack, EdgeX Foundry, Home Edge, Open Glossary of Edge Computing and Project EVE &ndash; some of which were already started previously. The open framework should provide support for newly emerging edge applications and connected things that require lower latency, faster data processing and mobile use. Based on this software stack, LF Edge should help to unify the fragmented Edge-Computing market by creating a shared open vision for the future of the sector.</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/embedded-software-and-its-colourful-world/">Embedded Software and its colourful world</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Smart cameras keep everything squarely in their sights</title>
		<link>https://future-markets-magazine.com/en/markets-technology-en/smart-cameras-keep-everything-squarely-in-their-sights/</link>
		
		<dc:creator><![CDATA[The Quintessence]]></dc:creator>
		<pubDate>Sun, 27 Dec 2020 07:00:34 +0000</pubDate>
				<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[Markets & Technology]]></category>
		<category><![CDATA[automated driving camera]]></category>
		<category><![CDATA[automatisiertes Fahren Kamera]]></category>
		<category><![CDATA[Echtzeitfähige Steuerung]]></category>
		<category><![CDATA[intelligent camera]]></category>
		<category><![CDATA[intelligent systems]]></category>
		<category><![CDATA[intelligente Kamera]]></category>
		<category><![CDATA[intelligente Systeme]]></category>
		<category><![CDATA[Netatmo Außenkamera]]></category>
		<category><![CDATA[netatmo outdoor camera]]></category>
		<category><![CDATA[Netatmo Welcome]]></category>
		<category><![CDATA[Presence Netatmo]]></category>
		<category><![CDATA[Qualitätskontrolle]]></category>
		<category><![CDATA[quality control]]></category>
		<category><![CDATA[real time control]]></category>
		<category><![CDATA[smart cameras]]></category>
		<category><![CDATA[Smarte Kameras]]></category>
		<category><![CDATA[surveillance camera]]></category>
		<category><![CDATA[Überwachungskamera]]></category>
		<guid isPermaLink="false">https://future-markets-magazine.com/?p=9140</guid>

					<description><![CDATA[<p>Smart cameras can process imagery autonomously before initiating responses of their own accord. In industrial&#8230;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/smart-cameras-keep-everything-squarely-in-their-sights/">Smart cameras keep everything squarely in their sights</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Smart cameras can process imagery autonomously before initiating responses of their own accord. In industrial applications, they are used for quality control or to control plants and systems. As surveillance cameras, they analyse videos themselves and only alert personnel in dangerous situations. </strong></p>
<p>Whether used for quality control in industry, building surveillance or even automated driving, the demand for around-the-clock, detailed video recording is growing. It generates large data volumes that necessitate high transmission rates and a great deal of storage.</p>
<h2><strong>Smart cameras and image processing combined</strong></h2>
<p>Smart cameras are becoming more popular for all the reasons mentioned above. These smart systems combine cameras and high-performance image processing in a single unit. With the aid of these edge devices, image pre-processing or even full video analysis is performed in the camera itself. This relieves the host system, so additional PCs or controllers are not required. Depending on the performance of the integrated processor or <a href="https://future-markets-magazine.com/en/encyclopedia/fpga/" target="_blank" title="Field Programmable Gate Array" class="encyclopedia">FPGA</a>, such cameras are able to identify colours and shapes and compare these to specified structures. They even detect human movement and behaviour. In the course of such tasks, methods from the fields of <a href="https://future-markets-magazine.com/en/encyclopedia/machine-learning/" target="_blank" title="Procedure by which computer systems acquire knowledge independently and can expand their knowledge, allowing them&hellip;" class="encyclopedia">machine learning</a> and AI are also employed.</p>
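<p>The pre-processing idea can be sketched in a few lines: rather than transmitting every pixel to a host, the camera reports only a compact summary of each frame. This is a minimal illustration with an invented toy frame, not an actual smart-camera pipeline:</p>

```python
# Hedged sketch of on-camera pre-processing: instead of streaming raw pixels,
# the edge device thresholds a grayscale frame and reports only a compact
# summary (number of bright pixels and their bounding box).

def summarise_frame(frame, threshold=128):
    """frame: 2D list of 0-255 grayscale values. Returns a small result dict."""
    hits = [(y, x) for y, row in enumerate(frame)
            for x, value in enumerate(row) if value >= threshold]
    if not hits:
        return {"bright_pixels": 0, "bbox": None}
    ys = [y for y, _ in hits]
    xs = [x for _, x in hits]
    return {"bright_pixels": len(hits),
            "bbox": (min(xs), min(ys), max(xs), max(ys))}  # x_min, y_min, x_max, y_max

frame = [
    [0,   0,   0, 0],
    [0, 200, 210, 0],
    [0, 190,   0, 0],
    [0,   0,   0, 0],
]
print(summarise_frame(frame))  # {'bright_pixels': 3, 'bbox': (1, 1, 2, 2)}
```

<p>A few bytes of summary leave the camera instead of the whole frame &ndash; which is exactly how the host system is relieved.</p>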
<h2><strong>Controlling plants and systems in real time</strong></h2>
<p>In industrial applications, this enables smart cameras to make decentralised decisions and react appropriately to defective parts, for example. This might involve activating an ejector device via PLC or notifying a human colleague who is monitoring the production process. Smart cameras are being deployed more frequently for typical real-time tasks, such as in pick-and-place applications. These smart cameras with powerful processors are increasingly sidelining the PC-based solutions that were previously the norm. With networked plant and robot control, embedded image-processing systems of this kind can provide comprehensive support for complex manufacturing steps.</p>
<p>To give one example, the company Wente/Thiedig specialises in industrial image-processing solutions. For a client from the automotive sector, it developed the SKG500 detection system, which determines the situation and position of heavy components in boxes to enable a robot arm to grip them precisely. Two smart camera systems of type VCSBC6211nano-RH manufactured by Vision Components act as independent image-processing systems.</p>
<div class="su-note" style="border-color:#007072;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><div class="su-note-inner su-clearfix" style="background-color:#007D7F;border-color:#cce5e5;color:#ffffff;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;">The smart VCSBC6211nano-RH board camera combines a computing power of 5,600 MIPS with compact dimensions (40 mm x 60 mm), making it particularly suitable for applications where very little installation space is available. The camera features a processor with a clock rate of 700 MHz, a 32-MB flash EPROM and SDRAM memory of 128 MB. It supports free programming, and live imagery can be output via the 100-Mbit Ethernet interface.</div></div>
<p>They optically assess the contours of the uppermost blank and determine its position, lateral situation and orientation before picking. The information gathered from the 3D image is then sent via Ethernet to the plant controller by the smart camera. The fact that the 3D detection system takes the form of an edge solution facilitates a reduced cycle time, which in turn guarantees optimum utilisation of the subsequent processing centres.</p>
<h2><strong>Human, animal or vehicle?</strong></h2>
<p>Alongside industrial image processing, security is the second major application area for smart cameras. With integrated intelligence, modern surveillance cameras can autonomously interpret the recorded video data. In the event of potential security risks, they alert security personnel or the apartment owner in real time, using preconfigured alarm rules. Smart video analysis automatically detects rule violations, such as when a person enters a blocked-off area within the recorded image range.</p>
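<p>A preconfigured alarm rule of this kind boils down to a simple geometric check on the camera's own detections. The sketch below is a minimal illustration; the zone coordinates, class names and detection format are invented for the example, not a vendor API:</p>

```python
# Hedged illustration of a preconfigured alarm rule: raise an alert only when
# a detected object classified as a person enters a blocked-off rectangle.

BLOCKED_ZONE = (100, 50, 300, 200)  # x_min, y_min, x_max, y_max in pixels

def violates_rule(detection: dict) -> bool:
    """detection: {'class': str, 'x': int, 'y': int} -- centre of the object."""
    x_min, y_min, x_max, y_max = BLOCKED_ZONE
    in_zone = x_min <= detection["x"] <= x_max and y_min <= detection["y"] <= y_max
    return detection["class"] == "person" and in_zone

alerts = [d for d in (
    {"class": "person", "x": 150, "y": 120},   # person inside the zone -> alert
    {"class": "person", "x": 20,  "y": 20},    # person outside -> no alert
    {"class": "dog",    "x": 150, "y": 120},   # animal inside -> no alert
) if violates_rule(d)]
print(len(alerts))  # 1
```

<p>Only the single rule violation would be forwarded to security personnel; everything else stays on the camera.</p>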
<p>The smart outdoor camera from Netatmo can differentiate whether a car, a dog or a person has entered the area. It analyses the recorded images in real time and informs the homeowner accordingly via a smartphone app.</p>
<p>Its indoor counterpart, the smart indoor camera from Netatmo, can even recognise faces once they have been &ldquo;trained&rdquo;, enabling it to identify the person entering the home and pass this information on to the respective resident. In addition, the system can differentiate between a pet&rsquo;s movements and other activities thanks to artificial intelligence. This helps to avoid alarms being triggered by pets; the user is only informed when something significant happens.</p>
<h2><strong>Racing drivers under surveillance</strong></h2>
<p>Surveillance cameras for professional applications are even more capable. Network cameras from Bosch feature 17 different video-analysis <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s, for instance. This enables them to differentiate actual safety or security incidents from harmless events. Integrated video analysis enables large volumes of recorded video material to be quickly scanned for critical information. As only relevant images are transmitted, the network load and required memory are considerably reduced.</p>
<p>One rather atypical application scenario for these cameras can be found at the Misano World Circuit, one of the most famous MotoGP circuits in the world.</p>
<div class="su-note" style="border-color:#007072;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><div class="su-note-inner su-clearfix" style="background-color:#007D7F;border-color:#cce5e5;color:#ffffff;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;">Since opening in 1972, the Misano World Circuit in northern Italy has been one of the most famous motorcycle racing circuits in the world. The circuit hosts several prestigious races, including the San Marino and Rimini&rsquo;s Coast Motorcycle Grand Prix. Every year, more than 600,000 MotoGP fans visit the arena, generating a turnover of more than EUR 62 million.</div></div>
<p>Here, the safety of the participating motorcyclists is of paramount importance to the circuit&rsquo;s operators. Rule infringements on the part of the riders must also be promptly detected; according to the rules of the Grand Prix, these are punished by points deductions. Yet maintaining a clear overview throughout 27 laps of a MotoGP race is no mean feat: with around 25 motorcyclists and speeds of more than 300 km/h, monitoring the circuit represents a real challenge.</p>
<p>For this reason, Bosch installed Autodome IP starlight 7000 HD cameras. Their video-analysis function was custom-configured for operations in Misano. The cameras automatically track race participants, enabling their safety and adherence to the rules to be monitored at any time. If an accident occurs, the cameras automatically inform the control centre of this. In this way, the situation on the racing circuit can be quickly analysed, while swift reactions are ensured.</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/smart-cameras-keep-everything-squarely-in-their-sights/">Smart cameras keep everything squarely in their sights</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Edge solutions: Consumer sector</title>
		<link>https://future-markets-magazine.com/en/markets-technology-en/edge-solutions-in-the-consumer-sector/</link>
		
		<dc:creator><![CDATA[The Quintessence]]></dc:creator>
		<pubDate>Fri, 25 Dec 2020 07:00:11 +0000</pubDate>
				<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[Markets & Technology]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[Bankensektor]]></category>
		<category><![CDATA[banking sector]]></category>
		<category><![CDATA[consumer devices]]></category>
		<category><![CDATA[Consumer-Geräte]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[Edge Technologien]]></category>
		<category><![CDATA[edge technologies]]></category>
		<category><![CDATA[Embedded Technologien]]></category>
		<category><![CDATA[embedded technologies]]></category>
		<category><![CDATA[financial sector]]></category>
		<category><![CDATA[Finanzsektor]]></category>
		<category><![CDATA[künstliche Intelligenz]]></category>
		<category><![CDATA[neviscura]]></category>
		<category><![CDATA[nevisQ]]></category>
		<category><![CDATA[safe payment]]></category>
		<category><![CDATA[Sicheres Bezahlen]]></category>
		<guid isPermaLink="false">https://future-markets-magazine.com/?p=9158</guid>

					<description><![CDATA[<p>Edge solutions in the consumer sector are possible thanks to increasingly powerful embedded technologies. As&#8230;</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/edge-solutions-in-the-consumer-sector/">Edge solutions: Consumer sector</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Edge solutions in the consumer sector are possible thanks to increasingly powerful embedded technologies, which make it possible to equip even everyday devices such as stoves or televisions with edge intelligence. This not only means greater ease of use but also leads to improved operation of the devices. Smart applications with local data processing can also provide added security in the home, for example, or allow simple payments using a smartphone.</strong></p>
<p>Smart devices with their own intelligence are increasingly conquering the consumer&rsquo;s everyday world as well, be it in the form of <a href="https://future-markets-magazine.com/en/encyclopedia/wearables/" target="_blank" title="Miniature electronic systems embedded into everyday objects which can be worn on &ndash; or even&hellip;" class="encyclopedia">wearables</a>, in home appliances or in assistance systems that make life simpler and safer for the elderly. A number of edge solutions already exist in the consumer sector.</p>
<p>Miele, for example, has launched a solution called Motionreact, which the oven uses to anticipate what the user wants to do next. The oven alerts the user to the end of a programme with an acoustic signal; as the user approaches the appliance, the signal falls silent and the light in the oven chamber switches on at the same time. Alternatively, the appliance and the oven-chamber light switch on when approached and the main menu appears on the display. From a technical perspective, the system operates via infrared sensors in the appliance panel, which respond to movements at a distance of roughly 20 to 40 centimetres in front of the appliance.</p>
<h2><strong>Edge solutions in the consumer sector</strong></h2>
<p>Artificial intelligence capabilities are increasingly being integrated into consumer devices. They not only allow greater ease of use but also improve how the devices operate.</p>
<p>An example of this is the new generation of top-of-the-range TVs from LG Electronics. Their intelligent processors use integrated AI to improve picture quality: deep-learning <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s analyse the quality of the signal source and choose the most suitable interpolation method for optimum picture playback accordingly. Additionally, the processor performs a dynamic fine calibration of the tone-mapping curve for HDR content in line with the ambient light, which is captured by an ambient light sensor in the TV. Picture brightness is optimised dynamically based on how the human eye perceives images under different lighting conditions, so that even the darkest scenes are reproduced with high contrast, fine detail and excellent colour depth, even in rooms with high ambient brightness.</p>
<p>Smart devices are also of tremendous assistance when it comes to making life safer and more comfortable for the elderly, as the example of neviscura from nevisQ shows. The discreet sensor system, integrated into the apartment&rsquo;s skirting boards, allows falls, for example, to be detected automatically and without any additional equipment attached to the body. The nurse-call system then informs the nursing staff in real time.</p>
<p>The data from the infrared sensors is recorded and analysed in a base station with smart functions. Local data processing immediately detects critical situations in the room, while the base station simultaneously acts as an interface to the call system. In the future, the AI sensor system will also use activity analyses to detect whether a person&rsquo;s condition is changing, and thus prevent critical situations before they arise.</p>
<h2><strong>Banking sector &ndash; one of the largest users of edge technologies</strong></h2>
<p>The <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> with its array of consumer <a href="https://future-markets-magazine.com/en/encyclopedia/wearables/" target="_blank" title="Miniature electronic systems embedded into everyday objects which can be worn on &ndash; or even&hellip;" class="encyclopedia">wearables</a> along with Edge Computing are also altering life outside of the home. This is especially true of how banks conduct their business.</p>
<p>According to a study by ResearchAndMarkets, the financial and banking sector is in fact one of the biggest users of Edge Computing worldwide. The growing acceptance of digital and mobile banking initiatives and of payment via <a href="https://future-markets-magazine.com/en/encyclopedia/wearables/" target="_blank" title="Miniature electronic systems embedded into everyday objects which can be worn on &ndash; or even&hellip;" class="encyclopedia">wearables</a> such as the Apple Watch, Fitbit or the smartphone is significantly increasing the demand for Edge-Computing solutions. That&rsquo;s because banking networks need to be as secure and reliable as possible in order to benefit from the advantages offered by <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> technologies, yet <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> devices themselves are difficult to secure. The security level required for banking applications calls for advanced cryptographic <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s. However, these <a href="https://future-markets-magazine.com/en/encyclopedia/cpu/" target="_blank" title="Central Processing Unit" class="encyclopedia">CPU</a>-intensive operations are extremely complex to implement on <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> devices.</p>
<p>For this reason, the use of security agents at the edge is recommended. This could be, for example, a <a href="https://future-markets-magazine.com/en/encyclopedia/router/" target="_blank" title="Device interconnecting multiple computer networks." class="encyclopedia">router</a> or a base station installed in the vicinity of the user that processes the security <a href="https://future-markets-magazine.com/en/encyclopedia/algorithm/" target="_blank" title="A generally interpretable unique description of a sequence of actions to resolve a &ndash; usually&hellip;" class="encyclopedia">algorithm</a>s and encrypts data travelling to and from <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> devices. Customers can thus complete banking transactions securely even with simpler <a href="https://future-markets-magazine.com/en/encyclopedia/wearables/" target="_blank" title="Miniature electronic systems embedded into everyday objects which can be worn on &ndash; or even&hellip;" class="encyclopedia">wearables</a>. As the use of <a href="https://future-markets-magazine.com/en/encyclopedia/iot/" target="_blank" title="Internet of Things" class="encyclopedia">IoT</a> devices for processing banking transactions and payments continues to grow, however, so does the need to store and process data in edge data centres. Because these micro data centres are closer to the user, data can be processed closer to its source, which reduces the system&rsquo;s response times (latency) as well as the costs of data transfer. Paying via smartphone is therefore just as fast as, or even faster than, paying in cash from a wallet.</p>
<p>The post <a href="https://future-markets-magazine.com/en/markets-technology-en/edge-solutions-in-the-consumer-sector/">Edge solutions: Consumer sector</a> appeared first on <a href="https://future-markets-magazine.com/en/">Future Markets Magazine</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
