
Use Cases

Developers today are moving to the serverless computing paradigm as a way to increase velocity. The use cases below are common starting points, although almost any application can be built this way. With Nimbella, you can also build use cases that other serverless frameworks do not traditionally handle because they require state to be shared between functions or they are long-running. Nimbella natively supports and manages state automatically, so developers do not have to manage it themselves, and it also supports long-running functions.

Web Application

Requests from web users are forwarded to Nimbella functions for processing, and the functions write the processed content to the database. Any state shared between different Nimbella function instances is managed automatically by Nimbella, so the application does not need its own state-management layer.
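
To make the pattern concrete, here is a minimal sketch of a web-facing function in Python. It assumes the Nimbella Python runtime exposes a `nimbella` module whose `redis()` call returns a client for the project's built-in key-value store; the request fields (`userId`, `payload`) and key layout are illustrative, not a prescribed API, so check the current SDK documentation.

```python
# Minimal sketch of a web-facing Nimbella function (Python runtime).
# Assumption: the built-in `nimbella` module provides redis() for the
# project's key-value store; field names and keys are illustrative.
import json
import nimbella

def main(args):
    # Incoming HTTP request parameters arrive in `args`.
    user_id = args.get("userId", "anonymous")
    payload = args.get("payload", {})

    # Shared state: persist the processed content so other function
    # instances can read it without managing a database themselves.
    db = nimbella.redis()
    db.set(f"user:{user_id}:latest", json.dumps(payload))

    # Return an HTTP response to the web client.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"status": "stored", "userId": user_id}),
    }
```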

Real-Time Data

Nimbella functions can be triggered from pub/sub topics or from event logs, giving you an elastic, scalable event pipeline without maintaining complicated clusters. These event-streaming pipelines can power analytics systems, update secondary data stores and caches, or feed monitoring systems. Nimbella can process events from any stream, such as Kafka, RabbitMQ, and others.
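
The sketch below illustrates one stage of such a pipeline. It assumes the event feed invokes the function with a `messages` list whose entries carry a JSON string in a `value` field (the convention used by the Apache OpenWhisk messaging feed that Nimbella builds on); the counter key, event fields, and `nimbella.redis()` call are assumptions for illustration.

```python
# Sketch of an event-pipeline function. Assumption: a Kafka-style feed
# invokes it with a `messages` list, each entry holding a JSON string
# in `value`. The key names and event fields are illustrative.
import json
import nimbella

def main(args):
    messages = args.get("messages", [])
    db = nimbella.redis()

    processed = 0
    for msg in messages:
        try:
            event = json.loads(msg.get("value", "{}"))
        except (TypeError, ValueError):
            continue  # skip malformed events

        # Example transformation: keep a running count per event type,
        # feeding a secondary store that dashboards or caches can read.
        db.hincrby("event:counts", event.get("type", "unknown"), 1)
        processed += 1

    return {"processed": processed}
```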

IoT (Internet of Things)

The IoT gateway pushes device status to Nimbella functions for processing. Nimbella offers an efficient, transparent way to send the processed data to other systems, such as writing it to a database or pushing it to desktop or mobile clients.
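
A rough sketch of such a handler is shown below. The payload fields (`deviceId`, `temperature`), the alert threshold, and the `nimbella.redis()` state call are assumptions for illustration rather than part of any gateway contract.

```python
# Sketch of an IoT status handler: the gateway POSTs a device reading,
# the function records it, and queues alerts for downstream consumers.
# Assumptions: payload fields, threshold, and nimbella.redis().
import json
import time
import nimbella

def main(args):
    device_id = args.get("deviceId")
    if not device_id:
        return {"statusCode": 400, "body": json.dumps({"error": "deviceId required"})}

    try:
        temperature = float(args.get("temperature"))
    except (TypeError, ValueError):
        temperature = None

    reading = {"temperature": temperature, "ts": int(time.time())}

    db = nimbella.redis()
    # Keep the latest reading per device so dashboards or other functions
    # can read it, and queue alerts that a notifier function could drain.
    db.set(f"device:{device_id}:latest", json.dumps(reading))
    if temperature is not None and temperature > 75.0:  # illustrative threshold
        db.rpush("alerts", json.dumps({"deviceId": device_id, **reading}))

    return {"statusCode": 200, "body": json.dumps({"recorded": True})}
```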

AI/ML (Artificial Intelligence/Machine Learning)

This is not a typical serverless use case because it requires state to be shared between functions. Today, serverless workflows typically compose APIs, functions, and events to automate processes, but in AI/ML use cases the automation is more complicated. With Nimbella functions, a workflow can do more than trigger events: it can also refer back to previous runs of that workflow and determine when the optimal outcome was achieved.
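
The sketch below shows how one step of such a workflow might record its outcome in shared state and compare it with earlier runs. The metric, key names, and `nimbella.redis()` call are illustrative assumptions, not a prescribed Nimbella workflow API.

```python
# Sketch of a workflow step that records each run's outcome in shared
# state and checks whether it beats the best result from previous runs.
# Assumptions: the `score` metric, key layout, and nimbella.redis().
import json
import time
import nimbella

def main(args):
    run_id = args.get("runId", str(int(time.time())))
    score = float(args.get("score", 0.0))  # e.g. validation accuracy

    db = nimbella.redis()
    # Append this run to the workflow's history.
    db.rpush("ml:runs", json.dumps({"runId": run_id, "score": score}))

    # Compare with the best outcome seen across previous instances of
    # the workflow, and update it if this run did better.
    best_raw = db.get("ml:best")
    best = json.loads(best_raw) if best_raw else {"runId": None, "score": float("-inf")}
    improved = score > best["score"]
    if improved:
        db.set("ml:best", json.dumps({"runId": run_id, "score": score}))

    return {"runId": run_id, "score": score, "improvedOnBest": improved}
```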