
Tech

Realizing IoT’s potential with AI and machine learning



The key to getting more value from industrial internet of things (IIoT) and IoT platforms is getting AI and machine learning (ML) workloads right. Despite the massive amount of IoT data captured, organizations are falling short of their enterprise performance management goals because AI and ML aren’t scaling for the real-time challenges they face. Solve the challenge of AI and ML workload scaling from the start, and IIoT and IoT platforms can deliver on their promise of improving operational performance.

Overcoming IoT’s growth challenges

More organizations are pursuing edge AI-based initiatives to turn IoT’s real-time production and process monitoring data into results faster. Enterprises adopting IIoT and IoT face the challenge of moving massive amounts of integrated data to a datacenter or centralized cloud platform to analyze it and derive recommendations using AI and ML models. The combination of higher costs for expanded datacenter or cloud storage, bandwidth limitations, and increased privacy requirements is making edge AI-based implementations one of the most common strategies for overcoming IoT’s growth challenges.

To use IIoT and IoT to improve operational performance, enterprises must address the following challenges:

  • IIoT and IoT endpoint devices need to progress beyond real-time monitoring to provide contextual intelligence as part of a network. The bottom line is that edge AI-based IIoT/IoT networks will be the de facto standard in industries that rely on supply chain visibility, velocity, and inventory turns within three years or less. Based on discussions VentureBeat has had with CIOs and IT leaders across financial services, logistics, and manufacturing, edge AI is the cornerstone of their IoT and IIoT deployment plans. Enterprise IT and operations teams want more contextually intelligent endpoints to improve end-to-end visibility across real-time IoT sensor-based networks. Build-out plans include having edge AI-based systems provide performance improvement recommendations in real time based on ML model outcomes.
  • AI and ML modeling must be core to an IIoT/IoT architecture, not an add-on. Attempting to bolt-on AI and ML modeling to any IIoT or IoT network delivers marginal results compared to when it’s designed into the core of the architecture. The goal is to support model processing in multiple stages of an IIoT/IoT architecture while reducing networking throughput and latency. Organizations that have accomplished this in their IIoT/IoT architectures say their endpoints are most secure. They can take a least-privileged access approach that’s part of their Zero Trust Security framework.
  • IIoT/IoT devices need to be adaptive enough in design to support algorithm upgrades. Propagating algorithms across an IIoT/IoT network down to the device level is essential for an entire network to achieve and maintain real-time synchronization. However, updating IIoT/IoT devices with algorithms is problematic, especially for legacy devices and the networks supporting them. It’s essential to overcome this challenge in any IIoT/IoT network because algorithms are core to edge AI succeeding as a strategy. Across manufacturing floors globally today, there are millions of programmable logic controllers (PLCs) in use, supporting control algorithms and ladder logic. Statistical process control (SPC) logic embedded in IIoT devices provides real-time process and product data integral to quality management succeeding. IIoT is actively being adopted for machine maintenance and monitoring, given how accurate sensors are at detecting sounds, vibrations, and any variation in the process performance of a given machine. Ultimately, the goal is to predict machine downtimes better and prolong the life of an asset. McKinsey’s study Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? found that IIoT-based data combined with AI and ML can increase machinery availability by more than 20%. The McKinsey study also found that inspection costs can be reduced by up to 25%, and annual maintenance costs reduced overall by up to 10%. The following graphic is from the study:


Above: Using IIoT sensors to monitor shock and vibration of production equipment is a leading use case that combines real-time monitoring and ML algorithms to extend the useful life of machinery while ensuring maintenance schedules are accurate.
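At its core, the vibration-monitoring use case compares each new sensor reading against a recent baseline and flags readings that deviate sharply. The following is a minimal, platform-agnostic sketch of that idea; the window size, warm-up count, and 3-sigma threshold are illustrative assumptions, not values from the McKinsey study or any particular product.

```python
from collections import deque
from statistics import mean, stdev

def make_vibration_monitor(window=50, threshold=3.0):
    """Flag readings that deviate sharply from the recent baseline."""
    history = deque(maxlen=window)

    def check(reading):
        anomaly = False
        if len(history) >= 10:  # wait for a minimal baseline before judging
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(reading - mu) > threshold * sigma:
                anomaly = True  # candidate trigger for a maintenance work order
        history.append(reading)
        return anomaly

    return check

monitor = make_vibration_monitor()
for r in [1.0, 1.1, 0.9, 1.05] * 5:  # steady-state vibration baseline
    monitor(r)
print(monitor(9.8))  # → True (a sudden spike well outside the baseline)
```

In a deployment, the flagged reading would feed the maintenance-scheduling system rather than a print statement; the rolling window keeps memory and compute small enough for an edge device.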

  • IIoT/IoT platforms with a unique, differentiated market focus are gaining adoption the quickest. For a given IIoT/IoT platform to gain scale, each needs to specialize in a given vertical market and provide the applications and tools to measure, analyze, and run complex operations. Many horizontally focused IoT platform providers rely on partners for the depth vertical markets require, even as the future of IIoT/IoT growth hinges on meeting the nuanced needs of specific markets. Greater market verticalization is a challenge for most IoT platform providers, as their platforms are built for broad, horizontal market needs. A notable exception is Honeywell Forge, with its deep expertise in buildings (commercial and retail), industrial manufacturing, life sciences, connected worker solutions, and enterprise performance management. Ivanti Wavelink’s acquisition of an IIoT platform from its technology and channel partner WIIO Group is more typical. The pace of such mergers, acquisitions, and joint ventures will increase across IIoT/IoT sensor technology, platforms, and systems, given the revenue gains and cost reductions companies are achieving across a broad spectrum of industries today.
  • Knowledge transfer must occur at scale. As workers retire while organizations abandon the traditional apprentice model, knowledge transfer becomes a strategic priority. The goal is to equip the latest generation of workers with mobile devices that provide real-time data about current conditions alongside contextual intelligence and historical knowledge. Current and future maintenance workers who don’t have decades of experience and nuanced expertise in how to fix machinery will be able to rely on AI- and ML-based systems that index captured knowledge and can answer their questions in seconds. Combining knowledge captured from retiring workers with AI and ML techniques is key: contextualizing that knowledge lets workers on the front line get the answers they need to operate, repair, and work on equipment and systems.
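Production systems of the kind described above would use embedding search or large language models over a curated knowledge base, but the underlying indexing idea can be sketched with naive keyword overlap. The maintenance notes and question below are hypothetical examples, not from any real deployment.

```python
import re
from collections import Counter

# Hypothetical knowledge base captured from retiring technicians.
NOTES = [
    "If the conveyor motor overheats, check the bearing lubrication first.",
    "Spindle vibration above baseline usually means a worn belt tensioner.",
    "Reset the PLC after replacing any proximity sensor on line 3.",
]

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def best_answer(question, notes=NOTES):
    """Return the captured note sharing the most terms with the question."""
    q = Counter(tokenize(question))

    def overlap(note):
        return sum((q & Counter(tokenize(note))).values())

    return max(notes, key=overlap)

print(best_answer("Why does the conveyor motor overheat?"))
# → "If the conveyor motor overheats, check the bearing lubrication first."
```

Swapping the keyword scorer for a vector-similarity search is the usual next step; the retrieval structure stays the same.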

How IIoT/IoT data can drive performance gains

A full 90% of enterprise decision-makers believe IoT is critical to their success, according to Microsoft’s IoT Signals Edition 2 study. Microsoft’s survey also found that 79% of enterprises adopting IoT see AI as either a core or a secondary component of their strategy. Prescriptive maintenance, improving user experiences, and predictive maintenance are the top three reasons enterprises are integrating AI into their IIoT/IoT plans and strategies.


Above: Microsoft’s IoT Signals Edition 2 Study explores AI, digital twins, edge computing, and IIoT/IoT technology adoption in the enterprise.

Based on an analysis of the use cases provided in the Microsoft IoT Signals Edition 2 study and conversations VentureBeat has had with manufacturing, supply chain, and logistics leaders, the following recommendations can improve IIoT/IoT performance:

  • Business cases that include revenue gains and cost reductions win most often. Manufacturing leaders looking to improve track-and-trace across their supply chains using IIoT discovered cost reduction estimates weren’t enough to convince their boards to invest. When the business case showed how greater insight accelerated inventory turns, improved cash flow, freed up working capital, or attracted new customers, funding for pilots met less resistance than when cost reduction alone was proposed. The more IIoT/IoT networks deliver the data platform to support real-time enterprise performance management reporting and analysis, the more likely they are to be approved.
  • Design IIoT/IoT architectures today for AI edge device expansion in the future. The future of IIoT/IoT networks will be dominated by endpoint devices capable of modifying algorithms while enforcing least privileged access. Sensors’ growing intelligence and real-time process monitoring improvements are making them a primary threat vector on networks. Designing in microsegmentation and enforcing least privileged access to the individual sensor is being achieved across smart manufacturing sites today.
  • Plan now for AI and ML models that can scale from operations to accounting and finance. The leader of a manufacturing IIoT project said that the ability to interpret, in real time, how shop-floor performance affects financials sold senior management and the board on the project. Knowing how trade-offs on suppliers, machinery selection, and crew assignments impact yield rates and productivity gains is key. A bonus is that everyone on the shop floor knows whether they hit their numbers for the day. Making immediate trade-offs on product quality analysis helps alleviate variances in actual costing on every project, thanks to IIoT data.
  • Design in support of training ML models at the device algorithm level from the start. The more independent a given device can be from a contextual intelligence standpoint, including fine-tuning its ML models, the more valuable the insights it will provide. The goal is to know how and where to course-correct in a given process based on analyzing data in real time. Device-level algorithms are showing potential to provide data curation and contextualization today. Autonomous vehicles’ sensors train ML models continually, using a wide spectrum of data, including radar, to interpret road conditions, obstacles, and the presence or absence of a driver. The following graphic from McKinsey’s study Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? explains how these principles apply to autonomous vehicles.


Above: Autonomous vehicles’ reliance on a wide spectrum of data and ML models to interpret and provide prescriptive guidance resembles companies’ challenges in keeping operations on track.
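At its simplest, training at the device level means nudging model weights incrementally as each reading arrives, rather than shipping raw data to the cloud for batch retraining. The sketch below shows the pattern with a one-feature linear model updated by stochastic gradient descent; the sensor stream, learning rate, and feature/target pairing are illustrative assumptions, not any vendor's implementation.

```python
def make_online_model(lr=0.05):
    """A one-weight linear model that learns from a stream, one sample at a time."""
    state = {"w": 0.0, "b": 0.0}

    def update(x, y):
        pred = state["w"] * x + state["b"]
        err = pred - y
        state["w"] -= lr * err * x  # gradient of squared error w.r.t. w
        state["b"] -= lr * err      # gradient of squared error w.r.t. b
        return pred

    def predict(x):
        return state["w"] * x + state["b"]

    return update, predict

update, predict = make_online_model()
# Stream of (sensor reading, target) pairs from a hypothetical device.
for x, y in [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)] * 500:
    update(x, y)
print(predict(4.0))  # approaches 8.0 as the device learns y = 2x
```

Because each update touches only two floats, this style of incremental learning fits the memory and compute budget of an endpoint device; real deployments apply the same idea to larger models with quantized weights.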

Real-time IoT data holds the insights needed by digital transformation initiatives to succeed. However, legacy technical architectures and platforms limit IoT data’s value by not scaling to support AI and ML modeling environments, workloads, and applications at scale. As a result, organizations accumulating massive amounts of IoT data, especially manufacturers, need an IoT platform purpose-built to support new digital business models.



Leaked Alder Lake prices strike at Ryzen’s CPU dominance


Here’s what leaked retailer pricing tells us about the performance of Intel’s upcoming Alder Lake S CPUs.


Intel’s 12th-gen Alder Lake processors aren’t upon us yet, but another price leak indicates they might indeed compete with AMD’s best CPUs, unlike current top-end Core offerings.

The latest oopsie comes from retail IT vendor Provantage, which puts the top-end Core i9-12900K at $605. The IT vendor also lists the Core i7-12700K at $420, as well as a Core i5-12600K for $283.

After news reports of the part numbers and prices surfaced, Provantage removed the listings. The latest leak follows reports two weeks ago—supposedly from European retailers—that placed the Core i9-12900K at $705, the Core i7-12700K at $495, and the Core i5-12600 at $343.

Before you jump to any conclusions, we want to point out that as reliable as leaked retail prices might seem, they can be very unreliable too. Oftentimes, stores prep for impending launches by using placeholder prices and specs. Those listings are then updated when the stores receive the final information.

The leaked info from Provantage itself indicates it’s not quite baked yet. For example, we know the top-end Alder Lake S chip will feature 8 performance cores and 8 efficient cores (Intel’s Alder Lake chips feature a radical new mixture of big and little cores), yet Provantage lists the top-end chip as an 8-core design.

Hothardware.com snapped this image of Intel’s 12th-gen Alder Lake CPU listings at retailer Provantage; the listings have since been removed.

Still, the combined retail leaks reinforce what we’ve already come to conclude: Intel’s 12th-gen Alder Lake S will at least suit up with the intent to take on AMD’s 16-core Ryzen 9 5950X.

That’s a marked change from the $550 8-core 11th gen Rocket Lake CPU, which lost badly to AMD’s $550 12-core Ryzen 9 5900X chip. With the 11th-gen desktop chips, Intel didn’t even try to field a CPU against AMD’s $750 Ryzen 9 5950X.

With its increased core efficiency, newer manufacturing process, and more physical cores than previous Intel consumer desktop CPUs, it’s entirely possible Intel’s 12th-gen Core i9 will actually end up priced somewhere between $604 and $705 when it comes out.


Intel is touting a marked increase in core efficiency with its 12th gen Alder Lake CPUs.

Note: When you purchase something after clicking links in our articles, we may earn a small commission. Read our affiliate link policy for more details.

One of the founding fathers of hardcore tech reporting, Gordon has been covering PCs and components since 1998.


The best Windows backup software


The best programs for keeping your data and Windows safely backed up.


We need backup software for our PCs because our storage drives won’t last forever. Backup software ensures we’re covered when the day comes that our primary drive up and dies.

It would be nice if Microsoft itself provided Windows users with something like Apple’s Time Machine: an effective, set-it-and-forget-it, total system recovery and backup solution that requires little interaction or thought on the user’s part. 

Instead, Microsoft delivers a mishmash of restore points, recovery discs, file backup, and even the un-retired System Backup (Windows 7), which was probably originally put out to pasture for its propensity to choke on dissimilar hardware. Online backup services are another option, but desktop clients tend to offer far more flexibility. 

Plenty of vendors have stepped in with worthy alternatives, and while none are quite as slick or transparent as Time Machine, some come darn close—and many are free. Read on for our top picks. 

Updated on 9/15/21 to include our review of the newest version of Aomei Backupper 6. It remains our favorite free backup software for Windows because it provides a near-total backup solution, with a generous number of features. As a paid program, however, there are better options. Read more about it below. And scroll to the bottom of this article to see links to all our backup software reviews.

Best overall backup software

There’s a reason True Image is renowned in the world of backup software. It’s capable, flexible, and rock-solid reliable. Indeed, it’s easily the most comprehensive data safety package on the planet.

Besides offering unparalleled backup functionality that’s both robust and easy to navigate, True Image integrates security apps as well, which protect against malware, malicious websites, and other threats using real-time monitoring. Read our full review.

Best free backup software

Among the free programs we tested, Backupper Standard wins primarily because it has the most features, including imaging, file backup, disk cloning, and plain file syncing, plus multiple scheduling options (see our full review). This was the case with Backupper 4, and the latest version has only added more options, making it a surprisingly well-rounded free offering. We hit a few performance snags with less-conventional system setups, but for the average user, it should perform as expected.

What to look for in backup software

As with most things—don’t over-buy. Features you don’t need add complexity and may slow down your system. Additionally, if you intend to back up to a newly purchased external hard drive, check out the software that ships with it. Seagate, WD, and others provide backup utilities that are adequate for the average user.

File backup: If you want to back up only your data (operating systems and programs can be reinstalled, though it’s mildly time- and effort-consuming), a program that backs up just the files you select is a major time-saver. Some programs automatically select the appropriate files if you use the Windows library folders (Documents, Photos, Videos, etc.).

Image backup/Imaging: Images are byte-for-byte snapshots of your entire hard drive (normally without the empty sectors) or partition, and can be used to restore both the operating system and data. An image is the most convenient way to restore after a system crash, and it also ensures you don’t miss anything important.

Boot media:  Should your system crash completely, you need an alternate way to boot and run the recovery software. Any backup program should be able to create a bootable optical disc or USB thumb drive. Some will also create a restore partition on your hard drive, which can be used instead if the hard drive is still operational.

Scheduling: If you’re going to back up effectively, you need to do it on a regular basis. Any backup program worth its salt allows you to schedule backups.

Versioning: If you’re overwriting previous files, that’s not backup, it’s one-way syncing or mirroring. Any backup program you use should allow you to retain several previous backups, or with file backup, previous versions of the file. The better software will retain and cull older backups according to criteria you establish.
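The retain-and-cull behavior described above amounts to a retention policy: keep the N newest versions, plus anything inside a rolling window, and mark the rest for deletion. A minimal sketch follows; the counts and 30-day window are illustrative defaults, not taken from any particular product.

```python
from datetime import datetime, timedelta

def cull_backups(timestamps, keep_recent=5, keep_days=30):
    """Split backup timestamps into (keep, delete): the keep_recent newest
    versions always survive, as does anything newer than keep_days."""
    ordered = sorted(timestamps, reverse=True)           # newest first
    cutoff = datetime.now() - timedelta(days=keep_days)
    keep = set(ordered[:keep_recent]) | {t for t in ordered if t >= cutoff}
    delete = [t for t in ordered if t not in keep]
    return sorted(keep, reverse=True), delete

# Example: nightly backups ranging from 1 to 80 days old.
now = datetime.now()
ages = [1, 2, 10, 40, 50, 60, 70, 80]
keep, delete = cull_backups([now - timedelta(days=d) for d in ages])
print(len(keep), len(delete))  # → 5 3  (days 1-50 kept, days 60-80 culled)
```

Commercial tools layer more criteria on top (grandfather-father-son rotation, per-file version counts), but they reduce to the same keep-or-delete decision per stored version.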

Optical support: Every backup program supports hard drives, but as obsolescent as they may seem, DVDs and Blu-ray discs are great archive media. If you’re worried about optical media’s reliability, M-Disc claims its discs are reliable for a thousand years, claims that are backed up by Department of Defense testing.

Online support: An offsite copy of your data is a hedge against physical disasters such as flood, fire, and power surges. Online storage services are a great way to maintain an offsite copy of your data. Backup to Dropbox and the like is a nice feature to have.

FTP and SMB/AFP: Backing up to other computers or NAS boxes on your network or in remote locations (say, your parents’ house) is another way of physically safeguarding your data with an offsite, or at least physically discrete, copy. FTP can be used for offsite backup, while SMB (Windows and most OSes) and AFP (Apple) are good for other PCs or NAS boxes on your local network.

Real time: Real-time backup means that files are backed up whenever they change, usually upon creation or save. It’s also called mirroring and is handy for keeping an immediately available copy of rapidly changing data sets. For less volatile data sets, the payoff doesn’t compensate for the drain on system resources. Instead, scheduling should be used.

Continuous backup: In this case, ‘continuous’ simply means backing up on a tight schedule, generally every 5 to 15 minutes, instead of every day or weekly. Use continuous backup for rapidly changing data sets where transfer rates are too slow, or computing power is too precious for real-time backup.
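A continuous backup loop of this kind is essentially a change detector on a timer: hash each file, copy only what changed, sleep, repeat. The following is a rough stdlib-only sketch of that loop; the 5-minute default mirrors the schedule described above, and real products additionally handle deletions, open-file locking, and versioning.

```python
import hashlib
import shutil
import time
from pathlib import Path

def changed(src: Path, last_hash: dict):
    """Return files under src whose contents changed since the last pass."""
    out = []
    for f in src.rglob("*"):
        if f.is_file():
            h = hashlib.sha256(f.read_bytes()).hexdigest()
            if last_hash.get(f) != h:
                last_hash[f] = h
                out.append(f)
    return out

def continuous_backup(src, dest, interval_min=5):
    """Copy changed files from src to dest on a tight schedule, forever."""
    src, dest = Path(src), Path(dest)
    seen = {}
    while True:
        for f in changed(src, seen):
            target = dest / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy only what changed this pass
        time.sleep(interval_min * 60)
```

Hashing every file each pass is what distinguishes this from real-time backup, which instead reacts to filesystem change notifications the moment a file is saved.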

Performance: Most backups proceed in the background or during dead time, so performance isn’t a huge issue in the consumer space. However, if you’re backing up multiple machines or to multiple destinations, or dealing with very large data sets, speed is a consideration.

How we test

We run each program through the various types of backups it’s capable of. This is largely to test reliability and hardware compatibility, but we time two: an approximately 115GB system image (two partitions), and a roughly 50GB image created from a set of smaller files and folders. We then mount the images and test their integrity via the program’s restore functions. We also test the USB boot drives created by the programs.

All of our reviews

If you’d like to learn more about our top picks as well as other options, you can find links below to all of our backup software reviews. We’ll keep evaluating new programs and re-evaluating existing software on a regular basis, so be sure to check back for our current impressions.


Jon is a Juilliard-trained musician, former x86/6800 programmer, and long-time (late 70s) computer enthusiast living in the San Francisco Bay Area. [email protected]


Razer just made gamer thimbles


Or maybe they’re yoga pants for your thumbs?


Razer has never been afraid to take a chance on products that seem unusual at first glance. Witness its RGB-infused N95 mask, the now-defunct Razer Game Store with its own zVault currency, or the first-gen Firefly mousepad, which has evolved into something special but originally prompted us to review it against a ripped-up piece of cardboard. The company’s latest offering might just take the cake, though. This week, Razer introduced gamer thimbles.

Yes, thimbles. You know, like the Monopoly piece (or the sewing accessory for more worldly folks out there). Seriously.

Well, not quite. If you simply can’t abide sweaty palms and greasy fingerprints interfering with your marathon mobile Fortnite sessions, the new Razer gaming finger sleeve may be up your alley. “Slip on and never slip up with Razer Gaming Finger Sleeve that will seal your mobile victory,” Razer’s site breathlessly boasts.  “Woven with high-sensitivity silver fiber for enhanced aim and control, our breathable sleeves keep your fingers deadly cool in the heat of battle, so you’ll always have a grip on the game.”

Razer says the 0.8mm-thick sleeves are sweat absorbent, and that they’re made from nylon and spandex. So maybe they’re more like gamer yoga pants? But you know, for your fingers?

Either way it’s ludicrous. And unlike most of Razer’s gear, the gamer thimbles understandably (yet sadly) lack RGB lighting. But if you want to wear your dedication to the Cult of Razer on your slee…thumb, or maybe just look snazzier when you’re passing Go and collecting $200, you can pick up a pair of Razer gaming finger sleeves on the company’s website for $10. The truly dedicated can double down to look especially gamer:

