endpoint bulk
lang dotnet
es_version 9.3
client Elastic.Clients.Elasticsearch==9.3.0

Elasticsearch 9.3 bulk endpoint (.NET example)

Use BulkRequest with BulkIndexOperation<T> to index multiple documents in a single request. The client serializes typed objects to JSON automatically.

public record Product(
    string Name, string Brand, double Price,
    string Category, bool InStock, double Rating);

var products = new List<Product>
{
    new("Espresso Machine Pro",       "BrewMaster",  899.99, "appliances",  true,  4.7),
    new("Noise-Cancelling Headphones", "SoundCore",  249.00, "electronics", true,  4.5),
    new("Ergonomic Standing Desk",     "DeskCraft",  599.00, "furniture",   false, 4.8),
    new("4K Webcam with Mic",          "StreamGear", 129.99, "electronics", true,  4.3),
    new("Cast Iron Dutch Oven",        "HearthStone",  79.95, "cookware",    true,  4.9),
    new("Mechanical Keyboard",         "TypeForce",  169.00, "electronics", true,  4.6),
    new("Air Purifier HEPA-13",        "CleanAir",   349.00, "appliances",  true,  4.4),
    new("Bamboo Cutting Board Set",    "HearthStone",  34.99, "cookware",    true,  4.2),
};

var request = new BulkRequest("products")
{
    Operations = new List<IBulkOperation>()
};

for (int i = 0; i < products.Count; i++)
{
    request.Operations.Add(
        new BulkIndexOperation<Product>(products[i]) { Id = $"prod-{i + 1}" }
    );
}

var response = await client.BulkAsync(request);

// Items contains one entry per operation, successful or not.
Console.WriteLine($"Processed {response.Items.Count} operations");

The Id property is optional — omit it to let Elasticsearch generate IDs automatically.
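For instance, a minimal sketch of the same loop without explicit IDs, assuming the same `client` and `products` collection as above (each response item then reports the identifier Elasticsearch generated):

```csharp
// Same bulk request, but letting Elasticsearch assign document IDs.
var autoIdRequest = new BulkRequest("products")
{
    Operations = new List<IBulkOperation>()
};

foreach (var product in products)
{
    // No Id set here, so Elasticsearch generates a unique _id per document.
    autoIdRequest.Operations.Add(new BulkIndexOperation<Product>(product));
}

var autoIdResponse = await client.BulkAsync(autoIdRequest);

// Each item in the response exposes the generated identifier.
foreach (var item in autoIdResponse.Items)
{
    Console.WriteLine($"Indexed with generated id: {item.Id}");
}
```

Auto-generated IDs avoid version conflicts on repeated runs, but explicit IDs make re-indexing idempotent; choose based on whether the same bulk job may run twice.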

Handling errors

The bulk request itself succeeds at the HTTP level even when individual operations fail, so a successful response does not mean every document was indexed. Check the Errors flag and iterate over Items to find the failures:

var response = await client.BulkAsync(request);

if (response.Errors)
{
    foreach (var item in response.Items)
    {
        if (item.Error is not null)
        {
            Console.WriteLine($"Failed {item.Id}: {item.Error.Reason}");
        }
    }
}

Large datasets

For large collections, use the BulkAll helper, which automatically partitions documents into batches with configurable parallelism and retry logic:

var bulkAll = client.BulkAll(products, b => b
    .Index("products")
    .BackOffRetries(3)
    .BackOffTime(TimeSpan.FromSeconds(1))
    .MaxDegreeOfParallelism(2)
    .Size(500)
);

bulkAll.Wait(TimeSpan.FromMinutes(5), next =>
{
    Console.WriteLine($"Indexed page {next.Page}");
});