Inference News

FIRST AI INTEGRATED MINI-ITX SYSTEM TO SIMPLIFY EDGE AND EMBEDDED AI DEPLOYMENT

The InferX Hawk AI system reduces time to market, risk, and cost with an AI Mini-ITX x86 system that serves both as a platform for new edge AI appliances and as a drop-in upgrade for existing solutions.

Using AI to Speed Up Edge Computing

AI is being designed into a growing number of chips and systems at the edge, where it is used to speed up the processing of massive amounts of data and to reduce power through partitioning and prioritization. That, in turn, allows systems to act on that data more rapidly.

Speeding Up AI Algorithms

AI at the edge is very different from AI in the cloud. Salvador Alvarez, solution architect director at Flex Logix, explains why a specialized inferencing chip with built-in programmability is more efficient and scalable than a general-purpose processor, why high-performance models are essential for accurate real-time results, and how low power budgets and ambient temperatures can affect the performance and life expectancy of these devices.

Inference Events

Big Data AI Toronto

Canada's #1 conference and expo. A unique 4-in-1 tech event experience, Big Data & AI Toronto 2022 combines four emerging technologies: Big Data, AI, Cloud, and Cybersecurity.

Edge Computing World

Edge Computing World will showcase the early successes of edge computing and the applications and data platforms that leverage this new paradigm.