Although many applications and services communicate over HTTP, for some time now it has been possible to observe the growing use of the gRPC protocol, especially in cases where efficiency and performance are critical to an application or service.
But all this efficiency can be compromised by failing to implement a Health Check mechanism correctly, especially when using Kubernetes. That is the topic we will cover in this post, through a simple example of a gRPC service with proper health check handling for use in Kubernetes.
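As a preview of the kind of setup the post works toward, here is a minimal sketch of a Deployment using Kubernetes' native gRPC probes. It assumes a recent Kubernetes version with native gRPC probe support and a container that serves the standard grpc.health.v1.Health service on port 50051; the name and image are hypothetical.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: grpc-demo                 # hypothetical name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: grpc-demo
  template:
    metadata:
      labels:
        app: grpc-demo
    spec:
      containers:
        - name: server
          image: example/grpc-demo:latest   # hypothetical image
          ports:
            - containerPort: 50051
          # Both probes call the standard gRPC Health Checking Protocol
          # (grpc.health.v1.Health/Check) instead of an HTTP endpoint.
          livenessProbe:
            grpc:
              port: 50051
          readinessProbe:
            grpc:
              port: 50051
```

On older clusters without native gRPC probe support, the same effect is commonly achieved with an `exec` probe running the grpc-health-probe binary inside the container.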
By the end of this post, you’ll be able to:
In this post, we will show you how to install and run Spark Jobs…
Process automation is a constant exercise: we should always be on the lookout for opportunities to automate tasks and optimize processes. In this post, we want to share one of those occasions: the creation of a simple but effective tool to validate and import Open API (Swagger) specifications into the Kong API Gateway, streamlining the deployment process for our customers.
At first glance, creating APIs seems an easy task for a member of the development team to perform. However, as the number of APIs grows, it can end up being a repetitive, costly, and error-prone process. …
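To make the idea concrete, here is a minimal sketch of the kind of structural check such a tool might run before importing a spec. The helper name and the required-field list are illustrative, not the tool described in the post; a real pipeline would use a full OpenAPI validator.

```python
import json

# Fields every OpenAPI 3.x document must carry at the top level.
REQUIRED_TOP_LEVEL = ("openapi", "info", "paths")

def validate_spec(spec: dict) -> list[str]:
    """Return a list of problems found; an empty list means the spec looks sane."""
    problems = [f"missing required field: {field}"
                for field in REQUIRED_TOP_LEVEL if field not in spec]
    if "paths" in spec and not spec["paths"]:
        problems.append("spec declares no paths")
    return problems

spec = json.loads("""
{
  "openapi": "3.0.0",
  "info": {"title": "petstore", "version": "1.0.0"},
  "paths": {"/pets": {"get": {"responses": {"200": {"description": "ok"}}}}}
}
""")
print(validate_spec(spec))   # → []
```

Once a spec passes validation, it can be pushed to Kong, for example by calling the Admin API (`POST /services`, `POST /routes`) or by using a declarative tool such as decK.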
In this series of posts, we will demonstrate several techniques and concepts for applying and reaping the benefits of the AWS Well-Architected Framework with the Elastic Kubernetes Service, better known as EKS.
Our goal is not to provide a detailed explanation of the framework itself, but rather to give a broad overview of it so that we can then apply its main concepts. For a more detailed understanding, we strongly recommend reading the official guide.
Speaking of the official documentation, we have extracted some excerpts that describe well what the framework is and what it is for:
"The AWS Well-Architected Framework is a…
Today we will explain one of Confluent Kafka's authentication options: SASL OAuthBearer authentication, combined with ACLs for authorization.
By default, a Confluent Kafka installation has no encryption, authentication, or authorization configured: all components communicate freely, in plain text, with any topic. Depending on the criticality of the information being transferred, this can be a big risk for the business, so it is crucial to have the security components configured and working properly.
Before jumping to the OAuthBearer mechanism, it is important to show the security options that Confluent Kafka supports, below…
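As a taste of where the post is heading, the broker-side configuration for OAuthBearer with ACLs looks roughly like the sketch below. The property names are Apache Kafka's standard ones; the listener port and the authorizer choice are assumptions (this sketch assumes a ZooKeeper-based cluster, where `AclAuthorizer` is used), and real values depend on your environment.

```properties
# Broker listener secured with SASL over TLS, using the OAUTHBEARER mechanism
listeners=SASL_SSL://:9093
security.inter.broker.protocol=SASL_SSL
sasl.enabled.mechanisms=OAUTHBEARER
sasl.mechanism.inter.broker.protocol=OAUTHBEARER
listener.name.sasl_ssl.oauthbearer.sasl.jaas.config=\
  org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;

# ACL-based authorization: deny by default unless an ACL allows the access
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
allow.everyone.if.no.acl.found=false
```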
Kubernetes has one of the most active open source communities, and it is no news that it has become consolidated as the standard for container orchestration. For these and several other reasons, it is essential to know the core components of Kubernetes and their interactions. One of the most well-known and effective ways to acquire this knowledge, and even to study for the CKA (Certified Kubernetes Administrator) certification, is Kelsey Hightower's Kubernetes the Hard Way, which walks through configuring a cluster from scratch.
This post aims to give an overview of the tutorial and notes on some points that most…
In this third part of the article API First: From Zero to Hero (Part 1), we will integrate the security layer of the API, covering all the steps for building complete APIs using the API First strategy.
We now have end-to-end integration testing for our API; however, we still need to protect it from unauthenticated access.
For this, we will use the OIDC framework (OpenID Connect v1.0), integrating the API Gateway (Kong) with the IAM solution (Keycloak).
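In this flow, Keycloak issues OIDC access tokens as JWTs, which Kong validates before letting a request through. Purely as an illustration of the token format (a JWT is three base64url segments: header.payload.signature), here is a standard-library sketch that decodes the payload claims. It performs no signature verification, which real validation must do; the token built here is a toy with hypothetical claims.

```python
import base64
import json

def b64url_decode(segment: str) -> bytes:
    """Decode a base64url segment, restoring the '=' padding JWTs strip off."""
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def jwt_claims(token: str) -> dict:
    """Extract the (unverified) payload claims from a header.payload.signature JWT."""
    _header, payload, _signature = token.split(".")
    return json.loads(b64url_decode(payload))

# Build a toy token with hypothetical claims just to exercise the decoder.
def b64url_encode(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

header = b64url_encode(json.dumps({"alg": "RS256", "typ": "JWT"}).encode())
payload = b64url_encode(json.dumps({"iss": "https://keycloak.example/realms/demo",
                                    "preferred_username": "alice"}).encode())
token = f"{header}.{payload}.fake-signature"

print(jwt_claims(token)["preferred_username"])   # → alice
```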
In this second part of the API First: From Zero to Hero (Part 1) article, we will level up by provisioning an API Gateway, importing the API, and running our API tests against the gateway.
Moving on with the integration tests, we will import our API into the API Gateway locally to validate the import process and enable our tests to run through the gateway. We will have something similar to the diagram below:
To facilitate the provisioning of resources, we will start using a Makefile to centralize the scripts and enable their reuse.
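A sketch of what such a Makefile might look like. Every target name and script path here is hypothetical; the point is only to show scripts gathered behind reusable, composable targets.

```makefile
.PHONY: up import test down

up:            ## start the API Gateway locally (e.g. via docker compose)
	docker compose up -d

import: up     ## import the OpenAPI spec into the gateway (hypothetical script)
	./scripts/import-api.sh petstore-v3.0.yaml

test: import   ## run the integration tests through the gateway (hypothetical script)
	./scripts/run-tests.sh

down:          ## tear everything down
	docker compose down
```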
To develop and manage microservice APIs following best practices and open standards, while ensuring quality and security, we created this hands-on article demonstrating how open source tools and common techniques can be used to enable an 'API First' approach.
The following requirements are needed to run the examples demonstrated in this tutorial:
We will use the petstore-v3.0.yaml spec as our OpenAPI contract.
Before we start, we must point out that this article will not cover specific details of testing techniques, as we understand there are many formidable sources on the internet for that.
For some, hearing the word 'QAOps' for the first time might prompt the thought: another (whatever)Ops term? Seriously?
Yes, but let me explain what QAOps is, why it is so important, and why it is a quality assurance trend for the coming years. We all agree that software quality plays a crucial role in pretty much every single 'digital' company across the world and even…