Scope of this document

AI Inference Server Release Notes

Product
AI Inference Server
Product Version
2.1.0
Language
en-US

This document applies to the following applications:

MLFB Number                  Application description
MLFB 6AV2170-0LA10-0AA0      AI Inference Server, single-pipeline version
MLFB 6AV2170-0LA10-1AA0      AI Inference Server, multiple-pipeline version
MLFB 6AV2170-0LA11-0AA0      AI Inference Server with support for GPU-accelerated hardware, single-pipeline version

Each chapter is valid for all of the applications listed above, except for statements where MLFB numbers are explicitly stated.

Notice: Only one AI Inference Server application instance (from the list above) can be installed on an edge device. That is, you can install MLFB 6AV2170-0LA10-0AA0, MLFB 6AV2170-0LA10-1AA0, or MLFB 6AV2170-0LA11-0AA0 on a single edge device, but you cannot install two of them (e.g. MLFB 6AV2170-0LA11-0AA0 and MLFB 6AV2170-0LA10-1AA0) on the same device.

Notice: It is the Customer's responsibility to verify end-to-end functionality, including PLC data writeback, before deploying to a production environment.

CAUTION: AI Inference Server must not be used in mission-critical, high-risk scenarios, i.e. the development, construction, maintenance, or operation of systems whose failure could lead to a life-threatening situation or to catastrophic damage ("critical applications"). Examples of critical applications: use in avionics, navigation, autonomous vehicle applications, AI solutions for automotive products, the military, medicine, life support, or other life-critical applications.