A command line app for management and maintenance of the Machine Learning (ML) Pipeline of the Auto ML Platform.
This service of the AI platform provides the API for the configuration of the AI platform, such as the schema that binds the ML extractors to fields in the object schema.
A Swagger UI is available via https://<host>/kairos-api/swagger-ui.html.
Characteristics
Note: The app is a component of the Auto ML Platform. This platform is not included in yuuvis® Momentum installations and is available as a beta version only on request.
Function
The management of the Machine Learning (ML) Pipeline is done via the command line application Kairos CLI. The following commands are available.
...
Log in to the Kairos CLI app to get access to the further commands. The login requires TENANT, USERNAME and PASSWORD in any order. If you don't specify values for all options, a dialog will be started.
You will stay logged in until you enter the kairos logout command. Even if you close the terminal and open it again, you are still logged in and able to run all commands.
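A possible login sequence is sketched below. Apart from kairos logout, which is documented above, the exact command and option names are assumptions and may differ in your installation:

```
# hypothetical login call; the option names for TENANT, USERNAME and PASSWORD are assumed
kairos login -t mytenant -u admin -p secret

# omitting options starts the interactive dialog that prompts for the missing values
kairos login

# end the session explicitly (documented command)
kairos logout
```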
...
Adds a new pipeline to the list. Requires a NAME for the new pipeline and a path to the new pipeline's ARTIFACT LOCATION.
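A hypothetical invocation might look as follows; the command name and option syntax are assumptions, only NAME and ARTIFACT LOCATION are taken from the description above:

```
# assumed command name and options; check the actual CLI help output
kairos pipeline add --name my-pipeline --artifact-location C:\ml\pipelines\my-pipeline
```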
...
Adds a new experiment to the list. Requires a NAME for the new experiment and a path to the new experiment's ARTIFACT LOCATION.
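Analogously, adding an experiment could be sketched like this (command name and option syntax are assumptions):

```
# assumed command name and options
kairos experiment add --name my-experiment --artifact-location C:\ml\experiments\my-experiment
```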
...
Removes an experiment and its configurations. Requires the ID of the experiment (NOT the pipeline!) you want to delete.
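A sketch of a removal call, assuming a hypothetical command name and ID format; only the required experiment ID is taken from the description above:

```
# assumed command name; pass the ID of the experiment, not the pipeline
kairos experiment remove 42
```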
...
Returns console output with a list of all available registered models.
Optionally, a MODEL NAME can be specified via the option -n. The output will be a list of all available versions of the corresponding model.
In addition to the option -n, the option -v can be used to specify a MODEL VERSION. The output will be a list containing the specific model version information.
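The three output levels described above might be requested as follows; the base command name and the example model name are assumptions, the options -n and -v are documented above:

```
# assumed base command name
kairos models                              # list all registered models
kairos models -n invoice-extractor         # list all versions of one model
kairos models -n invoice-extractor -v 3    # show details of one model version
```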
...
Returns console output of the specified inference schema.
Optionally, an APP NAME can be specified via the option -appName in order to get the inference schema of the indicated app.
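A minimal sketch, assuming a hypothetical command name; the option -appName is documented above:

```
# assumed command name; prints the inference schema to the console
kairos schema get -appName myclientapp
```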
...
Uploads an inference schema to the config API, overwriting the existing one. Requires the FILE PATH to the inference schema to be uploaded.
Optionally, an APP NAME can be specified via the option -appName in order to upload the inference schema for the indicated app.
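A possible upload call; the command name and file path are assumptions, FILE PATH and -appName follow the description above:

```
# assumed command name; uploads and overwrites the inference schema for the given app
kairos schema upload C:\config\inference-schema.json -appName myclientapp
```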
...
Downloads the inference schema to a specified location. Requires a RETURN PATH where the inference schema will be saved.
Optionally, an APP NAME can be specified via the option -appName in order to get the inference schema of the indicated app.
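A possible download call; the command name and return path are assumptions, RETURN PATH and -appName follow the description above:

```
# assumed command name; saves the inference schema to the given return path
kairos schema download C:\temp\inference-schema.json -appName myclientapp
```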
Requirements
The Kairos CLI app is a part of the Auto ML platform and can run only in combination with the other included components.
Kairos CLI furthermore requires:
- Java 13+
The current beta version of Kairos CLI can be operated only on Windows systems.
Installation and Configuration
Use the terminal in administrator mode to have access to all system directories.
- If you were already using Kairos CLI app before, uninstall the old version.
- Run the installation file for the current version of Kairos CLI.
- Finish the installation.
- Set the path of the Kairos CLI installation directory as a value for your system environment path variable, for example as shown below.
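On Windows, the installation directory can be appended to the PATH variable for the current user, for example with setx (the installation path shown is only an example; a newly opened terminal is required for the change to take effect):

```
# appends the assumed installation directory to the user PATH variable
setx PATH "%PATH%;C:\Program Files\KairosCLI"
```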
Now, configure the custom properties for the Kairos CLI app:
...
Service Name | kairos-api |
---|---|
Public API | KAIROS-API Endpoints |
Function
The KAIROS-API service is responsible for configuring the inference schema, which is in turn used by the PREDICT-API service to call the appropriate machine learning models for an object type.
The endpoints of the KAIROS-API service are provided in a dedicated API that system operators call to configure the system.
Requirements
The KAIROS-API service is a part of the Artificial Intelligence (AI) Platform and can run only in combination with the other included services.
Further requirements:
>> AI Platform Requirements
Configuration
The Inference Schema needs to be defined according to your client application.