The Archive Base Latest Questions

Editorial Team
Asked: May 16, 2026

So, I’m working on a database that I will be adding to my future


So, I’m working on a database that I will be adding to my future projects as a sort of supporting db, but I’m having a bit of an issue with it, especially the logs.

The database basically needs to be updated once a month. The main table has to be purged and then refilled from a CSV file. The problem is that SQL Server generates a log for it which is MEGA big. I was successful in filling it up once, but wanted to test the whole process by purging it and then refilling it.

That’s when I get an error that the log file is full. It jumps from 88MB (after shrinking via a maintenance plan) to 248MB, then stops the process altogether and never completes.

I’ve capped its growth at 256MB, incrementing by 16MB, which is why it failed, but in reality I don’t need it to log anything at all. Is there a way to completely bypass logging for any query being run against the database?

Thanks for any responses in advance!

EDIT: Per the suggestions of @mattmc3, I’ve implemented SqlBulkCopy for the whole procedure. It works AMAZINGLY well, except that my loop is somehow crashing on the very last remaining chunk that needs to be inserted. I’m not sure where I’m going wrong — heck, I don’t even know if this is a proper loop — so I’d appreciate some help with it.

I do know that it’s an issue with the very last GetDataTable or SetSqlBulkCopy call. I’m trying to insert 788,189 rows; 788,000 get in and the remaining 189 are crashing…

string[] Rows;

using (StreamReader Reader = new StreamReader("C:/?.csv")) {
    Rows = Reader.ReadToEnd().TrimEnd().Split(new char[] { '\n' },
        StringSplitOptions.RemoveEmptyEntries);
}

int RowsInserted = 0;

using (SqlConnection Connection = new SqlConnection("")) {
    Connection.Open();

    DataTable Table = null;

    // Insert full chunks of 1,000 rows.
    while ((RowsInserted < Rows.Length) && ((Rows.Length - RowsInserted) >= 1000)) {
        Table = GetDataTable(Rows.Skip(RowsInserted).Take(1000).ToArray());

        SetSqlBulkCopy(Table, Connection);

        RowsInserted += 1000;
    }

    // Insert whatever is left over as the final, partial chunk.
    Table = GetDataTable(Rows.Skip(RowsInserted).ToArray());

    SetSqlBulkCopy(Table, Connection);

    Connection.Close();
}

static DataTable GetDataTable(
    string[] Rows) {
    // Note: the DataTable must not be wrapped in a using block here,
    // because it is returned to (and used by) the caller after this
    // method exits.
    DataTable Table = new DataTable();

    Table.Columns.Add(new DataColumn("A"));
    Table.Columns.Add(new DataColumn("B"));
    Table.Columns.Add(new DataColumn("C"));
    Table.Columns.Add(new DataColumn("D"));

    for (int a = 0; a < Rows.Length; a++) {
        string[] Columns = Rows[a].Split(new char[] { ',' },
            StringSplitOptions.RemoveEmptyEntries);

        DataRow Row = Table.NewRow();

        Row["A"] = Columns[0];
        Row["B"] = Columns[1];
        Row["C"] = Columns[2];
        Row["D"] = Columns[3];

        Table.Rows.Add(Row);
    }

    return Table;
}

static void SetSqlBulkCopy(
    DataTable Table,
    SqlConnection Connection) {
    using (SqlBulkCopy SqlBulkCopy = new SqlBulkCopy(Connection)) {
        SqlBulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping("A", "A"));
        SqlBulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping("B", "B"));
        SqlBulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping("C", "C"));
        SqlBulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping("D", "D"));

        SqlBulkCopy.BatchSize = Table.Rows.Count;
        SqlBulkCopy.DestinationTableName = "E";
        SqlBulkCopy.WriteToServer(Table);
    }
}

EDIT/FINAL CODE: So the app is now finished and works AMAZINGLY well, and it’s quite speedy! @mattmc3, thanks for all the help! Here is the final code for anyone who may find it useful:

List<string> Rows = new List<string>();

using (StreamReader Reader = new StreamReader(@"?.csv")) {
    string Line = string.Empty;

    while (!String.IsNullOrWhiteSpace(Line = Reader.ReadLine())) {
        Rows.Add(Line);
    }
}

if (Rows.Count > 0) {
    int RowsInserted = 0;

    DataTable Table = new DataTable();

    Table.Columns.Add(new DataColumn("Id"));
    Table.Columns.Add(new DataColumn("A"));

    // Insert full chunks of 1,000 rows, clearing the table between chunks.
    while ((RowsInserted < Rows.Count) && ((Rows.Count - RowsInserted) >= 1000)) {
        Table = GetDataTable(Rows.Skip(RowsInserted).Take(1000).ToList(), Table);

        PerformSqlBulkCopy(Table);

        RowsInserted += 1000;

        Table.Clear();
    }

    // Insert the remaining partial chunk.
    Table = GetDataTable(Rows.Skip(RowsInserted).ToList(), Table);

    PerformSqlBulkCopy(Table);
}

static DataTable GetDataTable(
    List<string> Rows,
    DataTable Table) {
    for (int a = 0; a < Rows.Count; a++) {
        string[] Columns = Rows[a].Split(new char[] { ',' },
            StringSplitOptions.RemoveEmptyEntries);

        DataRow Row = Table.NewRow();

        Row["A"] = "";

        Table.Rows.Add(Row);
    }

    return Table;
}

static void PerformSqlBulkCopy(
    DataTable Table) {
    using (SqlBulkCopy SqlBulkCopy = new SqlBulkCopy(@"", SqlBulkCopyOptions.TableLock)) {
        SqlBulkCopy.BatchSize = Table.Rows.Count;
        SqlBulkCopy.DestinationTableName = "";
        SqlBulkCopy.WriteToServer(Table);
    }
}

1 Answer

Editorial Team
Answered on May 16, 2026 at 12:25 am

    If you are doing a bulk insert into the table in SQL Server, which is how you should be doing this (BCP, BULK INSERT, INSERT INTO...SELECT, or, in .NET, the SqlBulkCopy class), then you can use the “Bulk Logged” recovery model. I highly recommend reading the MSDN articles on recovery models: http://msdn.microsoft.com/en-us/library/ms189275.aspx
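    For reference, switching the recovery model around the monthly load might look roughly like this. This is a sketch, not part of the original answer: "MyDatabase" is a placeholder name, and bulk-logged recovery only reduces logging for minimally logged operations (such as a table-locked SqlBulkCopy into a heap), not for ordinary row-by-row DML.

    ```sql
    -- Placeholder database name: MyDatabase.
    -- Before the monthly load: allow bulk operations to be minimally logged.
    ALTER DATABASE MyDatabase SET RECOVERY BULK_LOGGED;

    -- ... run the SqlBulkCopy / BULK INSERT load here ...

    -- Afterwards: switch back to the previous model and take a log backup,
    -- since bulk-logged operations limit point-in-time recovery for that
    -- log interval.
    ALTER DATABASE MyDatabase SET RECOVERY FULL;
    ```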


© 2021 The Archive Base. All Rights Reserved
