The Archive Base



Editorial Team
Asked: May 16, 2026 at 9:00 am


Background: I’m working on a system where the developers seem to be using a function that executes a MySQL query like "SELECT MAX(id) AS id FROM table" whenever they need to get the id of the last inserted row (the table having an auto_increment column).
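For illustration, here is a minimal sketch of the two patterns side by side, using Python's sqlite3 as a stand-in for MySQL (the table and column names are made up; in MySQL the safe call would be LAST_INSERT_ID() or the driver's equivalent):

```python
import sqlite3

# SQLite stand-in for the MySQL setup described above.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE items (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)"
)

cur = conn.execute("INSERT INTO items (name) VALUES (?)", ("first",))

# Fragile pattern from the question: another client could insert a row
# between our INSERT and this SELECT, so MAX(id) may return someone
# else's row id.
max_id = conn.execute("SELECT MAX(id) AS id FROM items").fetchone()[0]

# Safe pattern: lastrowid (MySQL's LAST_INSERT_ID()) is read from
# connection-local state, so other clients cannot affect it.
safe_id = cur.lastrowid

print(max_id, safe_id)
```

With a single connection both values agree; the difference only shows up under concurrency, which is exactly the point of the question.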

I know this is a horrible practice (because concurrent requests will mess up the records), and I’m trying to communicate that to the non-tech/management team, whose response is…

"Oh okay, we'll only face this problem when we have 
(a) a lot of users, or 
(b) it'll only happen when two people try doing something
    at _exactly_ the same time"

I don’t disagree with either point, but I think we’ll run into this problem much sooner than they expect. What I’m trying to work out is a way to calculate how many users the system would need before we start seeing messed-up links.

Any mathematical insights into that? Again, I KNOW it’s a horrible practice; I just want to understand the variables in this situation…


Update: Thanks for the comments, folks – we’re moving in the right direction and getting the code fixed!



1 Answer

  1. Editorial Team
     Answered on May 16, 2026 at 9:00 am

    The point is not whether potential bad situations are likely; the point is whether they are possible. As long as there’s a non-trivial probability of a known issue occurring, it should be avoided.

    It’s not like we’re talking about changing a one-line function call into a 5,000-line monster to deal with a remotely possible edge case. We’re talking about actually shortening the call to a more readable and more correct usage.

    I kind of agree with @Mark Baker that there is some performance consideration, but since id is a primary key, the MAX query will be very quick. Sure, LAST_INSERT_ID() will be faster (since it’s just reading from a session variable), but only by a trivial amount.

    And you don’t need a lot of users for this to occur; all you need is enough concurrent requests (not even that many). If the time between the start of the INSERT and the start of the SELECT is 50 milliseconds (assuming a transaction-safe DB engine), then you only need 20 requests per second to start hitting this issue consistently. The point is that the window for error is non-trivial. At 20 requests per second (which in reality is not a lot), and assuming the average person visits one page per minute, you’re only talking about 1,200 users – and that’s for it to happen regularly. It could happen once with only 2 users.
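The arithmetic behind those numbers can be written out explicitly (all figures are the answer's assumptions, not measurements):

```python
# Assumed vulnerable window between the INSERT and the SELECT MAX(id).
window_s = 0.050

# At 20 requests/second the average gap between requests equals the
# window, so overlapping requests (and wrong ids) become routine.
requests_per_s = 20
avg_gap_s = 1 / requests_per_s
assert avg_gap_s == window_s

# Assuming one page view per user per minute, 20 req/s corresponds to
# 20 * 60 = 1200 requests per minute, i.e. about 1200 active users.
users = requests_per_s * 60
print(users)
```

The same sketch makes it easy to plug in your own window size and traffic estimates to see when collisions stop being a rare event.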

    And right from the MySQL documentation on the subject:

    You can generate sequences without calling LAST_INSERT_ID(), but the utility of 
    using the function this way is that the ID value is maintained in the server as 
    the last automatically generated value. It is multi-user safe because multiple 
    clients can issue the UPDATE statement and get their own sequence value with the
    SELECT statement (or mysql_insert_id()), without affecting or being affected by 
    other clients that generate their own sequence values.
    

© 2021 The Archive Base. All Rights Reserved