
Databricks announces Model Serving: Serverless Inference Service via API

  • March 7, 2023

Databricks launches Databricks Model Serving, a serverless, real-time machine learning model inference solution powered by a REST API.


Databricks introduces a real-time inference service with Databricks Model Serving. Because Model Serving is serverless, organizations can run inference on their own machine learning models without configuring any infrastructure, and the service is accessed through a REST API. Databricks hopes to simplify the workflows of many organizations.

Inference via API call

Organizations that have trained a model only need to prepare it for serving. Other applications can then call Databricks Model Serving via an API whenever inference is required. Data from the application is fed to the model, which draws a conclusion from it, recommends a product, or returns some other form of ML-based insight. Because the solution is serverless, Databricks ensures that there is always enough computing power available to handle the requested inference volume.
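To make the API flow concrete, here is a minimal sketch of what such a call could look like. The workspace URL, endpoint name, token, and feature names are placeholders, and the `dataframe_records` payload shape is one of the JSON formats Databricks serving endpoints accept; consult the official documentation for your deployment.

```python
import json

# Placeholder endpoint URL; a real one points at your Databricks workspace.
ENDPOINT_URL = "https://<workspace-url>/serving-endpoints/<endpoint-name>/invocations"


def build_request(records):
    """Build a JSON scoring payload as a 'dataframe_records' list,
    where each record maps feature names to values."""
    return json.dumps({"dataframe_records": records})


payload = build_request([{"feature_a": 1.0, "feature_b": 2.5}])

# The actual request would be sent with any HTTP client, e.g.:
# requests.post(ENDPOINT_URL,
#               headers={"Authorization": f"Bearer {token}"},
#               data=payload)
```

The calling application only deals with JSON over HTTPS; scaling the compute behind the endpoint is handled by the serverless service.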

The solution has some limitations. For example, it only works with machine learning models packaged with MLflow, although it doesn’t matter what’s underneath the MLflow wrapper. The service integrates with the Databricks Lakehouse platform and works seamlessly with other services running on that platform. The cost of the service depends on the desired capacity.
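The "MLflow wrapper" works because any model can be exposed through MLflow's generic `pyfunc` flavor. The sketch below is illustrative, not an official recipe: the class name and logic are invented, and in practice the wrapper would subclass `mlflow.pyfunc.PythonModel` and receive a pandas DataFrame as input.

```python
# Hypothetical sketch of a custom model wrapped for MLflow's pyfunc flavor.
class AddOneModel:
    """A trivial 'model' whose predict method mirrors the
    (self, context, model_input) interface pyfunc models implement."""

    def predict(self, context, model_input):
        # model_input would normally be a pandas DataFrame; a plain
        # list of numbers keeps this sketch dependency-free.
        return [x + 1 for x in model_input]


# Logging the wrapper for serving would then look roughly like:
# import mlflow.pyfunc
# mlflow.pyfunc.log_model("model", python_model=AddOneModel())
```

Because the serving layer only calls `predict`, the framework underneath the wrapper (scikit-learn, PyTorch, or hand-written logic as here) is irrelevant to the endpoint.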

Source: IT Daily
