Serverless vs. Kubernetes: Is there a right path?
This article is my summary of an insightful panel discussion on Serverless versus Kubernetes. It draws on the perspectives shared by industry professionals during the panel, shedding light on the relationship between serverless architectures and Kubernetes, with a focus on the logic behind using Kubernetes for stateless applications. By exploring the different viewpoints, we aim to better understand the merits and considerations of each approach. You can watch the full video here.
The Business Case for Autonomous Cloud Management
In this article, we delve into the reintroduction of autonomous infrastructure within the realm of large-scale systems, shedding light on the transformative changes that have taken place in infrastructure development over the past few decades. The insights shared here are the result of a collaborative effort involving multiple speakers who engaged in a thought-provoking discussion, comparing the distinctions between traditional monolithic systems and emerging paradigms such as Kubernetes and serverless environments. The focus lies on highlighting the advantages offered by enhanced flexibility and access to a diverse range of tools.
As the conversation unfolds, we shift our attention towards the crucial role that autonomous systems play in this dynamic landscape and why relying solely on automation falls short of our aspirations. By embracing autonomy, we open up new possibilities to transcend the previously inevitable trade-offs that accompanied investment decisions, business choices, and market strategies. We reflect on the age-old dilemma where one must choose between something being good, fast, or cheap, realizing that achieving all three simultaneously has long been an elusive goal. However, our speakers firmly believe that with advancements in technology and the successful implementation of autonomy, these trade-offs can be overcome.
Making Your Kubernetes Applications Autonomous
In the past few years, we have seen a huge increase in the adoption of Kubernetes. Kubernetes is an excellent platform that provides all the facilities you need to fine-tune your application and run it the way you want. We at Sedai use Kubernetes to deploy our applications, along with monitoring and some other managed services from AWS. One question we get a lot is, “Is there a need for another system, or an autonomous system, to manage Kubernetes?” Kubernetes is, after all, a declarative system, and an excellent one: you specify what your application needs to run, and Kubernetes takes care of the rest.
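To make that declarative model concrete, here is a minimal sketch using the official Kubernetes Python client: you describe the desired state (replicas, image, resource requests and limits) and Kubernetes continuously reconciles toward it. The image, namespace, and resource values are illustrative assumptions, not Sedai's actual workload.

```python
# Minimal sketch of Kubernetes' declarative model via the Python client.
# The image, namespace, and resource values are illustrative assumptions.
from kubernetes import client, config

def create_declarative_deployment() -> None:
    config.load_kube_config()  # use the local kubeconfig context

    container = client.V1Container(
        name="web",
        image="nginx:1.25",
        resources=client.V1ResourceRequirements(
            requests={"cpu": "250m", "memory": "256Mi"},
            limits={"cpu": "500m", "memory": "512Mi"},
        ),
    )
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="web"),
        spec=client.V1DeploymentSpec(
            replicas=3,  # declare the desired state; Kubernetes reconciles toward it
            selector=client.V1LabelSelector(match_labels={"app": "web"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "web"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )
    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)

if __name__ == "__main__":
    create_declarative_deployment()
```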
Cutting Serverless Latency by 50% with Autonomous Optimization
In this article, we will explore the world of optimization, with a special focus on serverless functions, particularly those leveraging Lambda. We will dive into how Sedai, a cutting-edge solution, helps optimize serverless functions. Sedai offers invaluable insights and tools that enable businesses to identify and address the areas where optimization can make a significant impact.
Moving to NoOps with Autonomous and Serverless
Today, we're going to explore the concept of running a software company without the need for an operations team or an SRE team. We'll discuss the evolving landscape of operations and its implications for developers. This discussion will be presented from a developer's perspective.
Achieving Autonomous Management with Datadog
Datadog can be utilized to achieve autonomous cloud environments in just 15 minutes. We'll discuss the partnership between Sedai and Datadog, the integration they have developed, and how customers can leverage Datadog to automate manual processes. The goal is to guide customers towards a fully autonomous system. By following our guidance, you can transform your investment in Datadog into a self-driving engine. Stay tuned as we reveal the steps to unlock the full potential of Datadog and empower your cloud environments.
Autonomous Builds on Observability: Picking the right metrics
Learn strategies for enhanced system performance and reliability. See how observability metrics lay the foundation for autonomous systems, and how Palo Alto Networks approaches production metrics. Explore the problems that come with an overwhelming influx of data and metrics, and how ML-driven correlation analysis can identify truly predictive metrics in an environment with multiple monitoring providers, multiple resource types, real-time data stress, and predictive analytics. See how Palo Alto Networks uses a multi-stage approach that includes a notice filter, aggregation, analysis, auto-remediation, and notifications, sketched below.
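The following is a purely illustrative sketch of what such a multi-stage pipeline can look like in code. It is not Palo Alto Networks' or Sedai's implementation; the metric names, thresholds, and stage logic are assumptions chosen only to show how notices flow from filtering through aggregation and analysis to remediation and notification.

```python
# Illustrative multi-stage metrics pipeline:
# notice filter -> aggregation -> analysis -> remediation/notification.
# Metric names, thresholds, and stage logic are assumptions for illustration.
from dataclasses import dataclass
from statistics import mean
from typing import Dict, List, Tuple

@dataclass
class Notice:
    resource: str
    metric: str
    value: float

def notice_filter(notices: List[Notice]) -> List[Notice]:
    """Drop noise: keep only notices for metrics treated as predictive."""
    predictive = {"p99_latency_ms", "error_rate"}
    return [n for n in notices if n.metric in predictive]

def aggregate(notices: List[Notice]) -> Dict[Tuple[str, str], float]:
    """Group notices per (resource, metric) and average their values."""
    grouped: Dict[Tuple[str, str], List[float]] = {}
    for n in notices:
        grouped.setdefault((n.resource, n.metric), []).append(n.value)
    return {key: mean(values) for key, values in grouped.items()}

def analyze(aggregates: Dict[Tuple[str, str], float]) -> List[Tuple[str, str]]:
    """Flag aggregates that breach an (assumed) per-metric threshold."""
    thresholds = {"p99_latency_ms": 500.0, "error_rate": 0.05}
    return [key for key, value in aggregates.items() if value > thresholds[key[1]]]

def remediate_and_notify(anomalies: List[Tuple[str, str]]) -> None:
    for resource, metric in anomalies:
        print(f"auto-remediating {resource} (breach on {metric}) and notifying on-call")

if __name__ == "__main__":
    raw = [
        Notice("checkout-svc", "p99_latency_ms", 750.0),
        Notice("checkout-svc", "cpu_temp", 61.0),  # filtered out as non-predictive
    ]
    remediate_and_notify(analyze(aggregate(notice_filter(raw))))
```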
The End of Cold Starts: Autonomous Concurrency for AWS Lambda
A comprehensive overview of how Sedai addresses the challenge of cold starts through autonomous concurrency. We explore the history of cold starts, their underlying causes, and their relationship with concurrency, before showing how an AI-based approach can resolve them.
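For context, the AWS control conventionally used against cold starts is provisioned concurrency, which pre-initializes execution environments so requests up to that level of concurrency never hit a cold start. The sketch below shows that underlying API call; the function name, alias, and count are placeholder assumptions, and an autonomous system would choose and update these values continuously from observed traffic rather than hard-coding them.

```python
# Minimal sketch of setting provisioned concurrency with boto3.
# Function name, alias, and count are placeholder assumptions.
import boto3

lambda_client = boto3.client("lambda")

def set_provisioned_concurrency(function_name: str, alias: str, executions: int) -> None:
    # Pre-initializes `executions` execution environments so requests up to
    # that level of concurrency avoid cold starts entirely.
    lambda_client.put_provisioned_concurrency_config(
        FunctionName=function_name,
        Qualifier=alias,  # provisioned concurrency applies to a version or alias
        ProvisionedConcurrentExecutions=executions,
    )

if __name__ == "__main__":
    set_provisioned_concurrency("checkout-handler", "live", 25)
```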
Using Kubernetes Autoscalers to Optimize for Cost and Performance
We explore the key role of autoscaling in optimizing performance and cost within Kubernetes. Specifically, we delve into two critical autoscalers—Horizontal Pod Autoscaler (HPA) and Vertical Pod Autoscaler (VPA)—and shed light on their functionalities, benefits, and limitations.
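As a quick illustration of the HPA, here is a minimal sketch of an autoscaler that targets CPU utilization, expressed as a plain manifest and applied with the Kubernetes Python client. The deployment name, replica bounds, and 70% target are illustrative assumptions.

```python
# Minimal sketch of a Horizontal Pod Autoscaler targeting CPU utilization.
# Deployment name, replica bounds, and the 70% target are assumptions.
from kubernetes import client, config

hpa_manifest = {
    "apiVersion": "autoscaling/v2",
    "kind": "HorizontalPodAutoscaler",
    "metadata": {"name": "web-hpa"},
    "spec": {
        "scaleTargetRef": {"apiVersion": "apps/v1", "kind": "Deployment", "name": "web"},
        "minReplicas": 2,
        "maxReplicas": 10,
        "metrics": [{
            "type": "Resource",
            "resource": {
                "name": "cpu",
                "target": {"type": "Utilization", "averageUtilization": 70},
            },
        }],
    },
}

if __name__ == "__main__":
    config.load_kube_config()
    client.AutoscalingV2Api().create_namespaced_horizontal_pod_autoscaler(
        namespace="default", body=hpa_manifest
    )
```

The VPA follows a similar declarative pattern, but instead of changing the replica count it adjusts the pods' CPU and memory requests.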
The Answer Isn’t Shift Left or Shift Right — It’s Shift Up

Microservices architectures are rapidly becoming the norm that architects rely on in cloud computing. There has been a lot of debate about whether it's best to shift left or shift right. With microservices, organizations must shift up and manage their systems autonomously.
[VIDEO] Join the Autonomous Movement
![[VIDEO] Join the Autonomous Movement](https://cdn.prod.website-files.com/622926e1a85e0cb10d8f5d5d/622fba429d8bdf5d262aecde_Screen%20Shot%202022-03-14%20at%202.57.08%20PM.png)
From ensuring the highest levels of uptime and availability to optimizing your code releases and cloud costs, learn how Sedai's autonomous cloud platform can become a staple in your SRE toolkit.
[VIDEO] The Autonomous Cloud Platform Built for SREs
![[VIDEO] The Autonomous Cloud Platform Built for SREs](https://cdn.prod.website-files.com/622926e1a85e0cb10d8f5d5d/622fb958c2fc987c684c2732_Screen%20Shot%202022-03-14%20at%202.52.50%20PM.png)
Sedai automatically discovers resources, intelligently analyzes traffic and performance metrics, and continuously manages your production environments — without manual thresholds or human intervention. Try the autonomous cloud platform for free at sedai.io
Smart Recommendations for a Serverless-first Strategy

Div Shekhar, AWS Solutions Architect, shares how AWS customers Coca-Cola, Nielsen Marketing Cloud, and Lego are driving agility, increasing performance, and improving security with a serverless strategy. Suresh Mathew, founder of Sedai, also shares the benefits of continuous and autonomous management of Lambda environments.
Sedai to Open New R&D Center in India

Today we announced our Series A funding, and we are thrilled to share that we are opening a new product engineering division in Thiruvananthapuram to advance the autonomous movement.