
The Archive Base


Editorial Team
Asked: May 13, 2026 at 3:33 pm


I am working on a data driven web application that uses a SQL 2005 (standard edition) database.

One of the tables is rather large (8 million+ rows, with about 30 columns). The size of the table obviously affects the performance of the website, which selects items from the table through stored procedures. The table is indexed, but performance is still poor due to the sheer number of rows – and this is part of the problem: the table is read about as often as it is updated, so we can’t add or remove indexes without making one of the two operations worse.

The goal here is to improve performance when selecting items from the table. The table holds ‘current’ data and old, barely touched data. The most effective solution we can think of at this stage is to separate the table in two: one for old items (before a certain date, say 1 Jan 2005) and one for newer items (on or after 1 Jan 2005).

We know of features like Distributed Partitioned Views – but these all require Enterprise Edition, which the client will not buy (and no, throwing hardware at it isn’t going to happen either).



1 Answer

  1. Editorial Team
     Added an answer on May 13, 2026 at 3:33 pm

    You can always roll your own “poor man’s partitioning / DPV,” even if it doesn’t smell like the right way to do it. This is just a broad conceptual approach:

    1. Create a new table for the current year’s data – same structure, same indexes. Adjust the stored procedure that writes to the main, big table so it writes to both tables (just temporarily). I recommend making the logic in the stored procedure say IF CURRENT_TIMESTAMP >= '[some whole date without time]' – this will make it easy to backfill into the new table the data that pre-dates the change to the procedure.
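    A minimal T-SQL sketch of this dual-write step, under assumed names (dbo.BigTable for the existing table, dbo.BigTable_Current for the new one, EventDate/Payload for columns, 1 Jan 2006 as an example cutover):

    ```sql
    -- Hypothetical schema: dbo.BigTable (existing), dbo.BigTable_Current (new,
    -- same structure and indexes). Temporarily write to both tables, gating the
    -- second insert on a fixed whole-date cutover so backfill is unambiguous.
    ALTER PROCEDURE dbo.InsertItem
        @EventDate DATETIME,
        @Payload   VARCHAR(100)
    AS
    BEGIN
        INSERT INTO dbo.BigTable (EventDate, Payload)
        VALUES (@EventDate, @Payload);

        IF CURRENT_TIMESTAMP >= '20060101'  -- whole date, no time component
            INSERT INTO dbo.BigTable_Current (EventDate, Payload)
            VALUES (@EventDate, @Payload);
    END;
    ```

    Rows written before the procedure change can later be copied into dbo.BigTable_Current with a single backfill query bounded by the same cutover date.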

    2. Create a new table for each year in your history using SELECT INTO from the main table. You can do this in a different database on the same instance to avoid the overhead in the current database. Historical data isn’t going to change, I assume, so in this other database you could even make it read-only when it is done (which will dramatically improve read performance).
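    A sketch of one year’s copy, with a hypothetical archive database and column names; note that SELECT INTO copies data only, so any indexes the reads rely on must be recreated afterwards:

    ```sql
    -- Copy one historical year into a separate archive database.
    SELECT *
    INTO ArchiveDB.dbo.BigTable_2004
    FROM dbo.BigTable
    WHERE EventDate >= '20040101' AND EventDate < '20050101';

    -- SELECT INTO does not carry indexes over; recreate what reads need.
    CREATE CLUSTERED INDEX IX_BigTable_2004_EventDate
        ON ArchiveDB.dbo.BigTable_2004 (EventDate);

    -- Once every historical year is loaded, freeze the archive database.
    ALTER DATABASE ArchiveDB SET READ_ONLY;
    ```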

    3. Once you have a copy of the entire table, you can create views: one that references just the current year, another that references 2005 through the current year (using UNION ALL between the current table and the tables in the other database that are >= 2005), and another that references all three sets of tables (those mentioned, plus the tables that pre-date 2005). Of course you can break this up even further, but I wanted to keep the concept minimal.
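    As a sketch, assuming hypothetical tables dbo.BigTable_Current and per-year archive tables in ArchiveDB, the middle view could look like:

    ```sql
    -- One SELECT branch per table; UNION ALL avoids the duplicate-elimination
    -- cost of a plain UNION, which matters at this row count.
    CREATE VIEW dbo.vBigTable_2005_To_Current
    AS
        SELECT EventDate, Payload FROM ArchiveDB.dbo.BigTable_2005
        UNION ALL
        SELECT EventDate, Payload FROM dbo.BigTable_Current;
    ```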

    4. Change your stored procedures that read the data to be “smarter” – if the date range requested falls within the current calendar year, use the smallest view that is only local; if the date range is >= 2005 then use the second view, else use the third view. You can follow similar logic with stored procedures that write, if you are doing more than just inserting new data that is relevant only to the current year.
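    The read-side routing could be sketched like this; all object names are hypothetical, with dbo.vBigTable_All standing for a view over all three sets of tables and dbo.vBigTable_2005_To_Current for the 2005-onward view:

    ```sql
    -- Route each read to the narrowest object covering the requested range.
    CREATE PROCEDURE dbo.GetItems
        @From DATETIME,
        @To   DATETIME
    AS
    BEGIN
        IF @From >= '20060101'        -- current year only: smallest, all local
            SELECT EventDate, Payload FROM dbo.BigTable_Current
            WHERE EventDate >= @From AND EventDate < @To;
        ELSE IF @From >= '20050101'   -- 2005 up to the current year
            SELECT EventDate, Payload FROM dbo.vBigTable_2005_To_Current
            WHERE EventDate >= @From AND EventDate < @To;
        ELSE                          -- full history
            SELECT EventDate, Payload FROM dbo.vBigTable_All
            WHERE EventDate >= @From AND EventDate < @To;
    END;
    ```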

    5. At this point you should be able to stop inserting into the massive table and, once everything is proven to be working, drop it and reclaim some disk space (and by that I mean freeing up space in the data file(s) for reuse, not performing a shrink db – since you will use that space again).

    I don’t have all of the details of your situation but please follow up if you have questions or concerns. I have used this approach in several migration projects including one that is going on right now.


© 2021 The Archive Base. All Rights Reserved