Introducing Trace, a new add-on

If you’re used to writing microservices with Node.js, you may be familiar with the frustration of performance issues, tracking down the root cause of an error, and hunting down memory leaks.

That’s why services like Trace are here. Trace is a Node.js performance monitoring tool which helps you understand how your application behaves and lets you find performance bottlenecks with ease.

Today we are launching Trace as an add-on for Clever Cloud, which can be tried out for free until July 1st without any restrictions.

Distributed tracing

Trace comes with a unique feature called distributed tracing: it visualizes whole transactions (request chains) in your microservices architecture and pinpoints errors.

This means that you can see on an interactive timeline:

  • which services took part in a transaction,
  • how long the network delays between them were,
  • how long each service took to handle a request,
  • where and when errors happened in that transaction.

Trace connects the services taking part in a request by attaching correlation IDs to them. This way you can visualize the exact data flow of faulty transactions, see the dependencies between your microservices, look for bad status codes and localize ongoing issues.

Service mapping

Trace provides you with a dynamic service map, which is automatically generated based on how the services in your system communicate with each other, or with your databases and external APIs.

Thanks to this, you can see what your application really looks like, and understand what makes it slow down. The service map also lets you find out how many requests your services handle and how high their response times are.

Getting started

Adding Trace to your services takes just a couple of lines of code, and it can be installed and used in under two minutes. There is a tutorial for the installation in Trace’s documentation.

Some use cases

Using Trace to fix crashing Node.js apps:

Using Trace to investigate slow Node.js apps:

