Nov 27, 2023

Insert 1 million dummy product data into SQL Server

Use the Bogus library to generate and insert 1 million dummy product records into a SQL Server database

We need to create 1 million rows of dummy product data in a SQL Server database, which can be used for development or performance testing purposes.

The Project

The project is a console application targeting .NET 6.0.

The project name is InsertMillionRecords.

We will use the Bogus package to generate random product data.

We use Entity Framework Core as the data access layer.

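Assuming the project is created with the .NET CLI, the Bogus package and the EF Core SQL Server provider can be added with commands like these:

dotnet add package Bogus
dotnet add package Microsoft.EntityFrameworkCore.SqlServer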

Product Model

To model the Products table in the database, we need to create the Product class:

public class Product
{
    public int Id { get; set; }
    public string Code { get; set; }
    public string Description { get; set; }
    public string Category { get; set; }
    public decimal Price { get; set; }
}
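
Note: the .NET 6 project templates enable nullable reference types by default, so the string properties above may produce CS8618 compiler warnings. If that applies to your project settings, one option is to give them default values:

public string Code { get; set; } = string.Empty;
public string Description { get; set; } = string.Empty;
public string Category { get; set; } = string.Empty;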

Entity Framework Data Context

Next, we create the Entity Framework data context class:

using Microsoft.EntityFrameworkCore;

namespace InsertMillionRecords;
public class DataContext : DbContext
{
    public DataContext(DbContextOptions<DataContext> options) : base(options)
    {
    }

    public DbSet<Product> Products { get; set; }
}

The Program.cs File

Initialize Data Context

First, we need to initialize the data context:

var connectionString = "Data Source=localhost; Initial Catalog=Product; Integrated Security=True";
var contextOptionsBuilder = new DbContextOptionsBuilder<DataContext>();
contextOptionsBuilder.UseSqlServer(connectionString);
var context = new DataContext(contextOptionsBuilder.Options);

For simplicity, the connection string is hardcoded here; in a real application it would normally come from configuration.
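
One caveat worth noting: newer versions of Microsoft.Data.SqlClient turn connection encryption on by default, so against a local SQL Server without a trusted certificate the string above may fail to connect. For local development only, a common workaround is:

var connectionString = "Data Source=localhost; Initial Catalog=Product; Integrated Security=True; TrustServerCertificate=True";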

Create Database

Every time the application runs, we recreate the database from scratch: EnsureDeletedAsync drops it if it exists, and EnsureCreatedAsync creates a fresh one from the model.

await context.Database.EnsureDeletedAsync();
await context.Database.EnsureCreatedAsync();

Setup Bogus Faker Class

First, we initialize the Faker<Product> object.

Next, we use the RuleFor() method to set up each property of the Product class.

The code is self-explanatory:

var faker = new Faker<Product>();
faker.RuleFor(p => p.Code, f => f.Commerce.Ean13());
faker.RuleFor(p => p.Description, f => f.Commerce.ProductName());
faker.RuleFor(p => p.Category, f => f.Commerce.Categories(1)[0]);
faker.RuleFor(p => p.Price, f => f.Random.Decimal(1, 1000));
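
Optionally (not part of the original post), Bogus can be seeded so the same 1 million products are generated on every run, which is handy for repeatable performance tests:

// set before Generate() so the generated data is deterministic across runs
Randomizer.Seed = new Random(8675309);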

Generate 1 Million Dummy Product Data

var products = faker.Generate(1_000_000);

The products variable now contains 1 million product records!

Create 10 Batches of Insertion

There is a chance of a command timeout if we try to insert 1 million records in a single operation.

Therefore, we split the process into 10 batches of 100,000 records each.

var batches = products
    .Select((p, i) => (Product: p, Index: i))
    .GroupBy(x => x.Index / 100_000)
    .Select(g => g.Select(x => x.Product).ToList())
    .ToList();
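
As an aside (not in the original code), .NET 6 also has a built-in Enumerable.Chunk method that produces the same batching more compactly; each batch is then a Product[] rather than a List<Product>, which AddRangeAsync accepts just as well:

// split the list into arrays of at most 100,000 products each
var batches = products.Chunk(100_000).ToList();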

Insert Each Batch into the Database

var count = 0;
foreach (var batch in batches)
{
    count++;
    Console.WriteLine($"Inserting batch {count} of {batches.Count}...");

    await context.Products.AddRangeAsync(batch);
    await context.SaveChangesAsync();
}
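
An optional tweak that is not in the original code: the context keeps tracking every entity it has saved, so by the last batch it is holding a million tracked objects. Clearing the change tracker after each batch keeps memory usage and per-batch time more stable:

var count = 0;
foreach (var batch in batches)
{
    count++;
    Console.WriteLine($"Inserting batch {count} of {batches.Count}...");

    await context.Products.AddRangeAsync(batch);
    await context.SaveChangesAsync();

    // detach the entities that were just saved so the context stays small
    context.ChangeTracker.Clear();
}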

The complete code of the Program.cs file:

using Bogus;
using InsertMillionRecords;
using Microsoft.EntityFrameworkCore;
using System.Diagnostics;

// initialize data context
var connectionString = "Data Source=localhost; Initial Catalog=Product; Integrated Security=True";
var contextOptionsBuilder = new DbContextOptionsBuilder<DataContext>();
contextOptionsBuilder.UseSqlServer(connectionString);
var context = new DataContext(contextOptionsBuilder.Options);

// create database
await context.Database.EnsureDeletedAsync();
await context.Database.EnsureCreatedAsync();

// setup bogus faker
var faker = new Faker<Product>();
faker.RuleFor(p => p.Code, f => f.Commerce.Ean13());
faker.RuleFor(p => p.Description, f => f.Commerce.ProductName());
faker.RuleFor(p => p.Category, f => f.Commerce.Categories(1)[0]);
faker.RuleFor(p => p.Price, f => f.Random.Decimal(1, 1000));

// generate 1 million products
var products = faker.Generate(1_000_000);

var batches = products
    .Select((p, i) => (Product: p, Index: i))
    .GroupBy(x => x.Index / 100_000)
    .Select(g => g.Select(x => x.Product).ToList())
    .ToList();

// insert batches
var stopwatch = new Stopwatch();
stopwatch.Start();

var count = 0;
foreach (var batch in batches)
{
    count++;
    Console.WriteLine($"Inserting batch {count} of {batches.Count}...");

    await context.Products.AddRangeAsync(batch);
    await context.SaveChangesAsync();
}

stopwatch.Stop();

Console.WriteLine($"Elapsed time: {stopwatch.Elapsed}");
Console.WriteLine("Press any key to exit...");

Run the Application

Now, let’s run the application. Building it in Release mode speeds up the process.
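
With the .NET CLI, for example, a Release build can be run like this:

dotnet run -c Release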

It took 1 minute and 9 seconds on my machine.

And now we have 1 million records in the Products table.

I am planning to use this dummy data to test the full-text search feature in SQL Server.

The source code of this post can be found here: https://github.com/juldhais/InsertMillionRecords

Thanks for reading 👍
