How it works

BonsAPPs will allow users to develop AI solutions that solve real-world industry challenges at the deep edge, which requires a number of AI artifacts and services produced by different stakeholders. AI developers and platform providers will be accessible on the BonsAPPs services layer to respond to these challenges, leveraging the Bonseyes AI Marketplace to deliver end-to-end, containerized, ready-to-integrate and reusable solutions that will also be integrated and interoperable with the AI4EU platform.

Role of different actors in AI Value Chain

A fully automated AI procurement process

AI-App pipeline: an integrating framework that brings data, algorithms, and deployment tools together.

[Figure: Challenge workflow on the Bonseyes AI Marketplace. An Innovator defines a Challenge. A Data Scientist solves the challenge, turning data and training tools into an AI Asset. A Developer deploys the asset with deployment tools, producing an AI App. An Integrator integrates the app, using benchmark tools to deliver the final AI Solution.]
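The hand-offs in the challenge workflow can be sketched as a small state machine. This is a hypothetical illustration, not the actual Bonseyes Marketplace API; the actor and artifact names simply mirror the diagram.

```python
from enum import Enum

class Artifact(Enum):
    """Artifacts produced along the AI value chain (names from the diagram)."""
    CHALLENGE = "challenge"
    AI_ASSET = "ai asset"
    AI_APP = "ai app"
    AI_SOLUTION = "ai solution"

# Each actor consumes one artifact type and produces the next.
VALUE_CHAIN = {
    "data scientist": (Artifact.CHALLENGE, Artifact.AI_ASSET),
    "developer": (Artifact.AI_ASSET, Artifact.AI_APP),
    "integrator": (Artifact.AI_APP, Artifact.AI_SOLUTION),
}

def advance(actor: str, current: Artifact) -> Artifact:
    """Advance the artifact one step along the value chain."""
    expected, produced = VALUE_CHAIN[actor]
    if current is not expected:
        raise ValueError(f"{actor} expects a {expected.value}, got {current.value}")
    return produced

# An Innovator defines the Challenge; the other actors carry it forward.
artifact = Artifact.CHALLENGE
for actor in ("data scientist", "developer", "integrator"):
    artifact = advance(actor, artifact)
```

The dictionary encodes the ordering constraint directly: a Developer cannot act until a Data Scientist has produced an AI Asset, matching the procurement workflow above.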



AI SERVICE LAYER

A Scalable Services Layer

For Real-World User-Focused AI Solutions

The BonsAPPs services will be accessible on the Bonseyes Marketplace and interoperable with the AI4EU community, responding to AI challenges that fit end users' needs.

[Figure: Bonseyes functionalities and services. Functionalities: Challenges, AI Assets, AI Apps, AI Solutions, Developer Platforms. Services for validators/users: experimentation with AI Apps, platforms and PoCs; security and licensing; model optimization; deployment on platforms and benchmarking; infrastructure.]

Edge Intelligence enablement requires a service layer

Fully integrated training and deployment workflows for AI Apps

We propose an end-to-end AI pipeline to develop and deploy Deep Neural Network solutions on embedded devices. The pipeline has a modular architecture built around four main tasks:

Data collection

Training and model compression

Hardware-in-the-loop deployment: model optimization, deployment, and benchmarking

Solution integration and model verification

Data: collection and labeling of data; data augmentation and export of training and benchmarking datasets.

Training: model training and compression.

Optimization, benchmarking and deployment: code optimization, benchmarking, and deployment on the target hardware infrastructure.

Test protocol: protocol and testing results.

Bias / sensitivity analysis: model analysis for bias and sensitivity to noise.

Integration: AI App integrated into the system.

Verification: correct performance is verified; the evaluation report provides feedback for the next KPI target, and a verification report is produced.
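The four modular tasks above can be sketched as composable pipeline stages that each transform an artifact and record what they did. The `Artifact` structure and stage labels are illustrative assumptions, not the actual Bonseyes tooling.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Artifact:
    """A model artifact flowing through the pipeline (hypothetical)."""
    name: str
    history: List[str] = field(default_factory=list)

Stage = Callable[[Artifact], Artifact]

def stage(label: str) -> Stage:
    """Return a pipeline stage that records its label on the artifact."""
    def run(artifact: Artifact) -> Artifact:
        artifact.history.append(label)
        return artifact
    return run

def run_pipeline(artifact: Artifact, stages: List[Stage]) -> Artifact:
    """Apply each stage in order, threading the artifact through."""
    for s in stages:
        artifact = s(artifact)
    return artifact

# The four main tasks from the text, expressed as ordered stages.
pipeline = [
    stage("data: collect, label, augment, export datasets"),
    stage("training: train and compress the model"),
    stage("deployment: optimize, deploy and benchmark on target hardware"),
    stage("integration: integrate the AI App and verify performance"),
]

result = run_pipeline(Artifact("face-detection"), pipeline)
```

Because each stage has the same `Artifact -> Artifact` signature, stages can be swapped or re-run independently, which is the point of the modular architecture described above.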

Enabling shift from Cloud to the Deep Edge

Transition from a cloud-centric to a deep-edge-centric model for end users

Optimization tools and infrastructure required for deployment of AI to resource-constrained devices:

Reduction in development time compared to monolithic system design methods

Reduction in the cost of ownership of training deep learning models, compared to current training approaches designed for the cloud

Re-usability of AI assets/apps on the marketplace

Reduction in the computational and memory requirements of deep learning models on embedded systems, compared to existing solutions
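One concrete way such memory reductions arise is weight quantization. The back-of-envelope calculation below uses an assumed parameter count for illustration (not a project benchmark): storing weights as 8-bit integers instead of 32-bit floats cuts weight memory by a factor of four.

```python
def model_size_bytes(num_params: int, bytes_per_param: int) -> int:
    """Weight-storage footprint of a model, ignoring activations."""
    return num_params * bytes_per_param

params = 5_000_000                       # illustrative small mobile CNN
fp32_size = model_size_bytes(params, 4)  # 32-bit float weights
int8_size = model_size_bytes(params, 1)  # 8-bit quantized weights

print(fp32_size // int8_size)  # prints 4
```

On a deep-edge device with only a few megabytes of RAM, that difference (20 MB versus 5 MB of weights here) can decide whether the model fits on the target at all.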
