Asked by Editorial Team on May 11, 2026 at 9:07 pm

I am trying to insert a huge amount of data into SQL Server. My destination table has a unique index called “Hash”.

I would like to replace my SqlDataAdapter implementation with SqlBulkCopy. SqlDataAdapter has a property called “ContinueUpdateOnError”; when it is set to true, adapter.Update(table) inserts all the rows it can and tags the failing rows with the RowError property.
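
For context, the adapter-based path looks roughly like the sketch below; the dbo.Destination table name and the command wiring are illustrative, not my exact code.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

static class AdapterInsert
{
    // Insert every row the adapter can; rows rejected by the unique "Hash"
    // index are tagged with RowError instead of aborting the whole batch.
    public static void InsertAndReportErrors(string connectionString, DataTable table)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter("SELECT * FROM dbo.Destination", conn))
        using (var builder = new SqlCommandBuilder(adapter))
        {
            adapter.InsertCommand = builder.GetInsertCommand();

            // Keep going on errors instead of throwing on the first failure.
            adapter.ContinueUpdateOnError = true;
            adapter.Update(table);

            // Each failed row carries the error text in RowError.
            foreach (DataRow row in table.GetErrors())
                Console.WriteLine(row.RowError);
        }
    }
}
```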

The question is: how can I use SqlBulkCopy to insert data as quickly as possible while keeping track of which rows were inserted and which were not (because of the unique index)?

Here is some additional information:

  1. The process is iterative and often runs on a schedule.

  2. The source and destination tables can be huge, sometimes millions of rows.

  3. Even though it is possible to check the hash values first, that requires two transactions per row (one to select the hash from the destination table, one to perform the insert). I think that in the adapter.Update(table) case it is faster to check RowError than to check for hash hits per row.

1 Answer

Answered by Editorial Team on May 11, 2026 at 9:07 pm

SqlBulkCopy has very limited error-handling facilities; by default it doesn’t even check constraints.

However, it’s fast. Really, really fast.

If you want to work around the duplicate-key issue and identify which rows in a batch are duplicates, one option is:

  • Start a transaction.
  • Take a TABLOCKX on the table, select all the current “Hash” values, and load them into a HashSet.
  • Filter out the duplicates and report them.
  • Insert the remaining rows.
  • Commit the transaction.

This approach works well if you are inserting huge sets and the table does not already hold too much data.
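
A rough sketch of that flow, assuming the destination is dbo.Destination, “Hash” is a string column, and the incoming rows arrive in a DataTable (all of these names are illustrative):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

static class BulkInsertWithDedup
{
    // Returns the rows that were skipped because their Hash was already present.
    public static List<DataRow> InsertNewRows(string connectionString, DataTable table)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var tran = conn.BeginTransaction())
            {
                // Exclusive table lock for the whole transaction, then snapshot
                // every existing Hash into memory.
                var existing = new HashSet<string>();
                using (var cmd = new SqlCommand(
                    "SELECT Hash FROM dbo.Destination WITH (TABLOCKX);", conn, tran))
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        existing.Add(reader.GetString(0));
                }

                // Split the batch into duplicates (reported) and new rows.
                var duplicates = table.AsEnumerable()
                    .Where(r => existing.Contains(r.Field<string>("Hash")))
                    .ToList();
                var fresh = table.AsEnumerable()
                    .Where(r => !existing.Contains(r.Field<string>("Hash")))
                    .ToList();

                // Bulk copy only the new rows, inside the same transaction.
                // Assumes the DataTable columns line up with dbo.Destination.
                using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tran))
                {
                    bulk.DestinationTableName = "dbo.Destination";
                    if (fresh.Count > 0)
                        bulk.WriteToServer(fresh.CopyToDataTable());
                }

                tran.Commit();
                return duplicates;
            }
        }
    }
}
```

The TABLOCKX is held until the commit, so no other writer can add a hash between the snapshot and the bulk copy.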

Can you please expand your question to include the rest of the context of the problem?

EDIT:

Now that I have some more context, here is another way you can go about it:

  • Bulk insert into a temp table.
  • Start a serializable transaction.
  • Select all the temp-table rows that are already in the destination table and report on them.
  • Insert the rows from the temp table into the real table, using a left join on Hash so that only the new rows are included.
  • Commit the transaction.

That process is very light on round trips, and given your specs it should end up being really fast.
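
A sketch of that version, using an illustrative #Staging temp table and a made-up (Hash, Payload) schema standing in for your real columns:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

static class StagedBulkInsert
{
    // Bulk copy into a session temp table, report the hashes that already
    // exist, then insert only the genuinely new rows. Names are illustrative.
    public static void InsertViaStaging(string connectionString, DataTable table)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // The temp table lives for the lifetime of this connection.
            using (var create = new SqlCommand(
                "CREATE TABLE #Staging (Hash NVARCHAR(64) NOT NULL, Payload NVARCHAR(MAX) NULL);",
                conn))
            {
                create.ExecuteNonQuery();
            }

            // 1. Bulk insert everything into the temp table.
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "#Staging";
                bulk.ColumnMappings.Add("Hash", "Hash");
                bulk.ColumnMappings.Add("Payload", "Payload");
                bulk.WriteToServer(table);
            }

            // 2. Serializable transaction: report the duplicates, insert the rest.
            using (var tran = conn.BeginTransaction(IsolationLevel.Serializable))
            {
                using (var report = new SqlCommand(
                    @"SELECT s.Hash
                      FROM #Staging AS s
                      JOIN dbo.Destination AS d ON d.Hash = s.Hash;", conn, tran))
                using (var reader = report.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("Already present, skipped: " + reader.GetString(0));
                }

                using (var insert = new SqlCommand(
                    @"INSERT INTO dbo.Destination (Hash, Payload)
                      SELECT s.Hash, s.Payload
                      FROM #Staging AS s
                      LEFT JOIN dbo.Destination AS d ON d.Hash = s.Hash
                      WHERE d.Hash IS NULL;", conn, tran))
                {
                    insert.ExecuteNonQuery();
                }

                tran.Commit();
            }
        } // #Staging is dropped automatically when the connection is closed.
    }
}
```

All the duplicate detection happens server-side, so the only round trips per run are the bulk copy, the duplicate report, and one INSERT ... SELECT.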
