How to Configure Elasticsearch in ASP.NET Core Web API
In this article, I will discuss how to Configure Elasticsearch in an ASP.NET Core Web API Application with Examples. Please read our previous article on the Differences Between Serilog and NLog in an ASP.NET Core Web API Application with Examples.
Log Analyzing Tools in ASP.NET Core
ASP.NET Core has built‑in support for logging via the Microsoft.Extensions.Logging API, which provides basic logging functionality. While this is sufficient for simple scenarios, production environments, especially high‑volume, distributed systems, often require more sophisticated solutions for log analysis. Developers need tools that not only capture logs but also help search, filter, and visualize them. Advanced log analyzers and visualization platforms are often integrated to help with the following:
- Diagnose issues and monitor performance.
- Search and analyze logs quickly.
- Visualize log data to gain insights into application behavior.
Commonly Used Log Analysis Tools:
The following are some of the Commonly Used Log Analysis Tools that can be used with ASP.NET Core Web Applications.
Serilog:
Serilog is a structured logging library that handles logs in various formats, including JSON and text. It can route logs to multiple destinations (called sinks), such as files, databases, or search engines like Elasticsearch, Seq, etc. With its support for structured data, Serilog simplifies filtering and querying logs by specific fields, making troubleshooting more efficient. It’s also widely supported by ASP.NET Core and can be integrated into your existing logging pipeline.
Seq:
Seq is a log server that logs structured events, typically from Serilog. It provides a rich, user-friendly web interface where you can query logs, set up alerts, and correlate events. For local development and smaller teams, Seq is a straightforward, powerful option that requires minimal configuration.
ELK Stack (Elasticsearch, Logstash, Kibana):
The ELK stack combines three open‑source projects to manage, process, and visualize large amounts of log data. It consists of:
- Elasticsearch: A distributed, RESTful search and analytics engine that stores and indexes log data as JSON documents for fast retrieval.
- Logstash: A log pipeline tool that collects logs from various sources, transforms (e.g., parsing, enrichment) the log data, and then forwards the data to a destination like Elasticsearch.
- Kibana: A web‑based visualization tool that sits on top of Elasticsearch. It provides the user interface to visualize and interact with the indexed log data stored in Elasticsearch.
What is Elasticsearch?
Elasticsearch is a distributed, RESTful search and analytics engine designed for:
- Storing and indexing large volumes of data: It stores data as JSON documents in a NoSQL format, which allows for a flexible schema.
- Near real‑time search: Newly indexed documents become searchable within about a second, enabling fast query responses even over high‑volume data.
- Full‑text search and aggregations: It supports complex queries and data aggregations, making it ideal for both log analysis and general data analytics.
Elasticsearch is often used for log analytics, full‑text search, and real‑time data analysis. It forms the storage and search backbone of the ELK stack.
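To make the "full‑text search" idea concrete, the following is a hedged sketch of the kind of JSON body you would POST to an index's _search endpoint. The index name (products) and field name (description) are hypothetical, chosen only for illustration:

```json
{
  "query": {
    "match": {
      "description": "wireless headphones"
    }
  }
}
```

Sent to POST /products/_search, this asks Elasticsearch to analyze the search text and return the documents whose description field best matches it, ranked by relevance score.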
What is Logstash?
Logstash is a robust data processing pipeline that:
- Collects Data from Multiple Sources: It can collect logs from files, databases, and network sources.
- Transforms Data: Logstash parses, filters, and reformats log data.
- Forwards Data to Elasticsearch: After processing, it sends the structured data to Elasticsearch for storage and indexing.
This transformation layer makes it easier to standardize log formats, ensuring that Elasticsearch receives well‑structured data.
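As an illustration only, a minimal Logstash pipeline configuration might look like the following. The file path and index name are assumptions for this sketch and are not part of the setup used later in this article:

```conf
input {
  file {
    # Hypothetical location of the application's log files
    path => "C:/logs/myapp-*.log"
    start_position => "beginning"
  }
}

filter {
  # Parse each line as JSON so individual fields become searchable in Elasticsearch
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # Daily index, one per calendar day
    index => "myapp-logs-%{+YYYY.MM.dd}"
  }
}
```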
What is Kibana?
Kibana is a visualization and exploration tool designed to work with Elasticsearch. It provides:
- Interactive Dashboards: Users can build custom dashboards to monitor key metrics and trends.
- Powerful Search and Filter Capabilities: Using Kibana Query Language (KQL), you can build complex queries to drill down into your logs.
- Real‑Time Analysis: Kibana updates visualizations as new data is indexed, making it ideal for monitoring live systems.
Kibana is the face of the ELK stack. It lets you see, explore, and understand the data stored in Elasticsearch.
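For example, in Kibana's search bar you could filter logs with KQL expressions like the ones below. The field names (Level, Application, message) depend on how your logs are indexed, so treat these as illustrative:

```kql
Level : "Error" and Application : "MyAspNetCoreApp"
message : "timeout" and @timestamp >= "2025-02-22"
```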
Implementing Elasticsearch in an ASP.NET Core Web API with Serilog
Let us proceed and see how to implement Elasticsearch logging locally on a Windows machine using the ASP.NET Core Web API Application with Logging using Serilog.
Set Up Elasticsearch Locally:
Let’s see how to set up Elasticsearch on our local machine. Please download and install Elasticsearch from the official website. First, visit the following URL.
https://www.elastic.co/downloads/elasticsearch
Once you visit the above URL, it will open the following page. From this page, download the setup appropriate for your Operating System. I am using Windows, so I selected Windows and downloaded the Elasticsearch ZIP file, as shown in the image below.
Extract the ZIP Archive
Once the download completes, extract the ZIP file’s contents to a location of your choice. For example, you might extract it to: D:\Elasticsearch
After extraction, you will see several directories, including bin, config, data, and logs, as shown in the image below.
Configure Elasticsearch
Open the config directory (e.g., D:\Elasticsearch\config\) and locate the elasticsearch.yml file. You can leave it as-is for default settings or modify certain parameters, such as:
Disable Security (for local development only):
- xpack.security.enabled: false (the default is true; change it to false). When security is enabled, Elasticsearch requires TLS certificates and authentication, which adds setup overhead for a local demo. Do not disable security in production.
Network settings (if you need to change the binding IP or port):
- network.host: 192.168.0.1
- http.port: 9200
Save your changes and close the file.
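Putting these together, the relevant portion of elasticsearch.yml for this simple local setup might look like the following (the network settings are optional; the defaults are fine for localhost):

```yaml
# Disable security for local development only; never do this in production.
xpack.security.enabled: false

# Optional: change the binding IP and port (the defaults work for localhost).
#network.host: 192.168.0.1
#http.port: 9200
```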
Run Elasticsearch
Open a Command Prompt or PowerShell window and navigate to the bin directory of the extracted Elasticsearch folder (e.g., D:\Elasticsearch\bin).
Then, run Elasticsearch by executing the elasticsearch.bat command in the command prompt, as shown in the image below.
Startup takes a little while; once it completes, Elasticsearch listens on localhost:9200 by default.
Verify Installation
Once Elasticsearch starts, open your web browser and navigate to the URL http://localhost:9200/. As shown in the image below, you should see a JSON response with information about your Elasticsearch node, including the version, cluster name, and more.
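The response is similar in shape to the following. The actual values (node name, cluster name, UUID, version) will differ on your machine; this is only an illustration:

```json
{
  "name": "MY-MACHINE",
  "cluster_name": "elasticsearch",
  "cluster_uuid": "abc123...",
  "version": {
    "number": "8.x.x",
    "build_flavor": "default"
  },
  "tagline": "You Know, for Search"
}
```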
Create an ASP.NET Core Web API Project
Open Visual Studio and create a new ASP.NET Core Web API project, giving the project name ElasticSearchDemo. Once you create the Project, please install the following packages using the Package Manager Console in Visual Studio.
- Install-Package Serilog.AspNetCore
- Install-Package Serilog.Sinks.Console
- Install-Package Serilog.Sinks.Elasticsearch
- Install-Package Serilog.Settings.Configuration
Configuring Serilog via appsettings.json
Please modify your appsettings.json file with the Serilog settings. This configuration tells Serilog to write logs both to the Console and to an Elasticsearch instance running on http://localhost:9200. It also specifies the minimum logging level.
{
  "Serilog": {
    "MinimumLevel": {
      "Default": "Information",
      "Override": {
        "Microsoft": "Warning",
        "System": "Warning"
      }
    },
    "WriteTo": [
      { "Name": "Console" },
      {
        "Name": "Elasticsearch",
        "Args": {
          // URL where Elasticsearch is running
          "nodeUris": "http://localhost:9200",
          // Automatically registers an index template in Elasticsearch
          "autoRegisterTemplate": true,
          // Naming pattern for the Elasticsearch index (daily indexes in this example)
          "indexFormat": "aspnetcore-logs-{0:yyyy.MM.dd}"
        }
      }
    ],
    "Properties": {
      "Application": "MyAspNetCoreApp"
    }
  },
  "AllowedHosts": "*"
}
Elasticsearch Configuration Explanation:
- WriteTo: Defines where the logs should be written. In this case, to the Console and Elasticsearch.
- nodeUris: Specifies the URI of your Elasticsearch server.
- autoRegisterTemplate: When set to true, the sink automatically registers a template in Elasticsearch to configure index mappings for your logs.
- indexFormat: Determines the naming convention for indexes. This example creates daily indexes named with the date (e.g., aspnetcore-logs-2025.02.22).
Configuring Program.cs to Use Serilog
Next, modify your Program.cs class file to set up the host builder to read from the configuration. This ensures that Serilog is initialized as early as possible in the application lifecycle.
using Serilog;

namespace ElasticSearchDemo
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var builder = WebApplication.CreateBuilder(args);

            // Add services to the container.
            builder.Services.AddControllers()
                .AddJsonOptions(options =>
                {
                    // Preserve original property names during JSON serialization/deserialization.
                    options.JsonSerializerOptions.PropertyNamingPolicy = null;
                });

            // Replace the default logging with Serilog.
            // Configure Serilog using the settings from appsettings.json.
            builder.Host.UseSerilog((context, services, configuration) =>
            {
                configuration.ReadFrom.Configuration(context.Configuration);
            });

            builder.Services.AddEndpointsApiExplorer();
            builder.Services.AddSwaggerGen();

            var app = builder.Build();

            // Configure the HTTP request pipeline.
            if (app.Environment.IsDevelopment())
            {
                app.UseSwagger();
                app.UseSwaggerUI();
            }

            app.UseHttpsRedirection();
            app.UseAuthorization();
            app.MapControllers();
            app.Run();
        }
    }
}
Note: The UseSerilog extension method replaces the default logging with Serilog and ReadFrom.Configuration(context.Configuration) tells Serilog to use the settings from the appsettings.json configuration file.
Creating a Controller with Different Log Levels and Structured Logging
Let us create a controller to demonstrate logging at various levels using Serilog. Create an API Empty Controller named LoggingDemoController within the Controllers folder and copy and paste the following code.
using Microsoft.AspNetCore.Mvc;

namespace MyAspNetCoreApp.Controllers
{
    [ApiController]
    [Route("[controller]")]
    public class LoggingDemoController : ControllerBase
    {
        private readonly ILogger<LoggingDemoController> _logger;

        public LoggingDemoController(ILogger<LoggingDemoController> logger)
        {
            _logger = logger;
        }

        [HttpGet("loglevels")]
        public IActionResult LogAllLevels()
        {
            // Trace-level logging: very detailed diagnostic information.
            _logger.LogTrace("LogTrace: Entering the LogAllLevels endpoint with Trace-level logging.");

            // Debug-level logging: useful for debugging during development.
            int calculation = 5 * 10;
            _logger.LogDebug("LogDebug: Calculation value is {calculation}", calculation);

            // Information-level logging with structured data.
            var employeeInfo = new { Id = 1, Name = "Pranaya", Department = "IT" };
            _logger.LogInformation("LogInformation: Employee info: {@employeeInfo}", employeeInfo);

            // Warning-level logging: indicates a potential issue.
            bool isTakingMoreTime = true;
            if (isTakingMoreTime)
            {
                _logger.LogWarning("LogWarning: External API is taking more time to respond. Action may be required soon.");
            }

            try
            {
                // Simulate an error scenario (e.g., division by zero).
                int x = 0;
                int result = 10 / x;
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "LogError: An error occurred while processing the request.");
            }

            // Critical-level logging: indicates a failure in the application that requires immediate attention.
            bool criticalFailure = true;
            if (criticalFailure)
            {
                _logger.LogCritical("LogCritical: A critical system failure has been detected. Immediate attention is required.");
            }

            return Ok("All logging levels demonstrated in this endpoint.");
        }
    }
}
Running the Application
First, ensure Elasticsearch is Running. Then, start your application from Visual Studio.
Test the Controller:
Open your browser or use a tool like Postman or Swagger to access the endpoint below. Please replace the port number with the port on which your application is running.
http://localhost:<port>/LoggingDemo/loglevels
Check your Console output
You can also verify that the log entries are being sent to Elasticsearch by making a request to the following Elasticsearch endpoint (replace the date in the index name with the current date):
http://localhost:9200/aspnetcore-logs-2025.02.22/_search?pretty
This query will return a JSON response containing the documents (log entries) stored in the aspnetcore-logs-2025.02.22 index.
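Instead of fetching every document, you can also POST a query body to the _search endpoint. The following illustrative request returns only Error-level entries; the exact field name (e.g., level) depends on the index template the Serilog sink registers, so inspect one of your documents first:

```json
{
  "query": {
    "match": {
      "level": "Error"
    }
  },
  "size": 10
}
```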
What Is Kibana?
Kibana is a web‑based visualization and analytics tool that works hand‑in‑hand with Elasticsearch. It provides:
- A Graphical User Interface (GUI): Easily interact with data stored in Elasticsearch.
- Dashboards and Visualizations: Create charts, graphs, and dashboards to monitor application performance or troubleshoot issues.
- Real‑Time Analysis: Filter and search logs interactively using Kibana Query Language (KQL).
Step 1: Download Kibana
Visit the Official Kibana Download Page: Open your web browser and navigate to the Kibana download page.
https://www.elastic.co/downloads/kibana
Choose the Appropriate Version: Select the Windows distribution that matches your Elasticsearch version. Download the ZIP file, as shown in the image below.
Step 2: Extract the Package
Once the download is complete, extract the contents of the ZIP file to a preferred directory (for example, D:\Kibana) as shown in the image below.
Step 3: Configure Kibana
Navigate to the extracted folder and open the config folder. Then, open the kibana.yml file using a text editor (e.g., Notepad or Visual Studio Code).
Set Elasticsearch Host: Locate the setting for elasticsearch.hosts and ensure it points to your running Elasticsearch instance. For example, if Elasticsearch is running locally on port 9200, update or uncomment the line:
elasticsearch.hosts: ["http://localhost:9200"]
Adjust Additional Settings (Optional): You can configure other settings as needed, such as:
server.port (default is 5601)
server.host (default is “localhost”)
If there are any other parameters based on your environment, save the changes.
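For reference, the relevant kibana.yml entries for this local setup would look like the following (the commented lines show the defaults):

```yaml
elasticsearch.hosts: ["http://localhost:9200"]

# Optional: the defaults are shown below.
#server.port: 5601
#server.host: "localhost"
```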
Modify node.options config file
Please open the node.options config file and then remove or comment out the following setting:
## enable OpenSSL 3 legacy provider
--openssl-legacy-provider
Step 4: Run Kibana
Open Command Prompt or PowerShell: Press Win + R, type cmd, and press Enter. Then navigate to the Kibana bin directory by executing cd D:\Kibana\bin in the command prompt.
Start Kibana: Run the kibana.bat command in the command prompt as shown in the below image to start Kibana:
Kibana will start and begin logging its startup process. Wait until you see a message indicating Kibana is ready to accept connections.
Step 5: Access Kibana
Open a web browser and navigate to http://localhost:5601. You may see a welcome screen; choose to Explore on my own to start analyzing your log data immediately.
When to Use Elasticsearch in ASP.NET Core?
Elasticsearch is best suited for scenarios where traditional logging solutions struggle to keep up with volume or complexity. Some common use cases include:
- High-Volume Logging: Applications that produce large amounts of logs, especially in microservices environments, benefit from Elasticsearch’s scalability. Its distributed architecture allows logs to be stored and indexed in parallel, ensuring quick search performance.
- Complex Querying and Aggregation: If your system needs to run detailed queries, such as filtering logs by multiple fields or generating aggregations (e.g., finding the average response time or the count of specific error codes), Elasticsearch can handle these operations efficiently.
- Centralized Analysis Across Multiple Sources: In a distributed system, logs often come from various services, containers, or virtual machines. Elasticsearch centralizes these logs and provides a unified view, simplifying debugging and root cause analysis.
- Integration with Visualization Tools: With Kibana, Elasticsearch offers a straightforward way to turn raw log data into interactive graphs, heatmaps, and other visualizations. This visual aspect helps non-technical stakeholders understand trends and spot anomalies.
- Advanced Alerting and Monitoring: By integrating with tools like Watcher (a part of the Elastic Stack), you can set up alerts for specific patterns or conditions in your logs, such as repeated login failures or high latency on certain endpoints.
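The "Complex Querying and Aggregation" point above can be made concrete with a single request. The following illustrative query counts log entries per level; the field name level.keyword is an assumption about the index mapping, so adjust it to match your own documents:

```json
{
  "size": 0,
  "aggs": {
    "entries_by_level": {
      "terms": {
        "field": "level.keyword"
      }
    }
  }
}
```

Setting "size": 0 suppresses the individual hits, so the response contains only the aggregation buckets (one per log level, with a document count each).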
In the next article, I will discuss how to Implement Caching in an ASP.NET Core Web API Application. In this article, I explain Elasticsearch in an ASP.NET Core Web API application with examples. I hope you enjoy this article, "How to Integrate Elasticsearch in an ASP.NET Core Web API Application."