John Davidson

php - Would random BIGINT used as a primary key significantly affect performance?



Background:


I'm trying to find the best way to store IDs in APIs. UUIDs have a number of performance-related issues, and incremental IDs are not safe to expose in some cases. After experimenting with different approaches, I found a random BIGINT (PHP example) to be the best compromise between performance and uniqueness.
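The question mentions a PHP example; as a rough illustration of the idea (sketched here in Python, with a hypothetical `random_bigint` helper), a random ID that fits a signed BIGINT column can be drawn from a cryptographically secure source:

```python
import secrets

# Hypothetical helper: generate a random positive integer that fits in a
# signed BIGINT column (max 2**63 - 1). The question's example value,
# 9007199254740991 (2**53 - 1), is the largest integer JavaScript can
# represent exactly -- a common cap when IDs travel through JSON.
def random_bigint(bits: int = 63) -> int:
    # secrets.randbits draws from a CSPRNG, so IDs are hard to guess
    return secrets.randbits(bits) or 1  # avoid 0, which is often reserved

uid = random_bigint()
assert 1 <= uid <= 2**63 - 1
```

With 63 random bits, collisions among realistic row counts are astronomically unlikely, though a UNIQUE constraint is still prudent.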


Question:


Would a random BIGINT (e.g. 9007199254740991) used as a primary key significantly affect query performance for operations like joins?


On small databases it performs well, but I have never had a chance to test it on large production databases.



My Answer:
Using a random BIGINT as a primary key can affect query performance, especially for operations like joins.

When the primary key is random, new rows are not inserted in key order. In engines that cluster rows by primary key (such as InnoDB), out-of-order inserts cause page splits and fragmentation, and lookups lose cache locality because consecutive inserts and reads keep touching different pages. This can slow queries down, especially joins over large datasets.
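A toy model makes the locality point concrete (this is an illustration in Python, not a benchmark; `PAGE_SIZE` and `KEYSPACE` are made-up parameters): sequential keys always land on the current last page, while random keys scatter writes across the whole key range.

```python
import random

# Toy model: a clustered index stores rows in fixed-size "pages" ordered
# by key. We count how many distinct pages an insert stream writes to.
PAGE_SIZE = 100          # keys per page in this toy model
KEYSPACE = 2**20         # hypothetical key range for random inserts

def pages_touched(keys):
    # distinct pages written by the stream (page = key // PAGE_SIZE)
    return len({k // PAGE_SIZE for k in keys})

sequential = list(range(10_000))
rng = random.Random(42)
random_keys = [rng.randrange(KEYSPACE) for _ in range(10_000)]

seq_pages = pages_touched(sequential)   # 100 pages, filled one after another
rnd_pages = pages_touched(random_keys)  # far more pages, each sparsely filled
assert seq_pages < rnd_pages
```

The sequential stream fills 100 pages back to back; the random stream touches thousands of pages, each only partially filled, which is the fragmentation and cache-miss pattern described above.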

Additionally, random keys can increase storage overhead: a BIGINT itself is always 8 bytes, but the page splits caused by out-of-order inserts leave index pages partially filled, so the index occupies more pages than it would with sequential keys.

It is important to weigh the specific requirements of your application and database when choosing a primary key. If performance is a concern, consider incremental IDs, or a combination of keys, such as a sequential internal primary key paired with a separate random identifier for external use, to get both uniqueness and performance.
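One common form of that combination (a sketch under assumed names, not a prescription; shown in Python with SQLite for self-containment) keeps a sequential integer as the internal primary key for compact, append-only index growth, and exposes a separate random identifier to API clients:

```python
import secrets
import sqlite3

# Hypothetical schema: `id` is sequential and used for joins internally;
# `public_id` is random and is the only value clients ever see.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE users (
        id        INTEGER PRIMARY KEY,    -- sequential, used for joins
        public_id INTEGER UNIQUE NOT NULL -- random, safe to expose
    )
""")

def create_user(conn) -> int:
    # draw a random 63-bit identifier; the UNIQUE constraint catches the
    # (astronomically unlikely) collision case
    public_id = secrets.randbits(63)
    conn.execute("INSERT INTO users (public_id) VALUES (?)", (public_id,))
    return public_id

pid = create_user(conn)
row = conn.execute(
    "SELECT id FROM users WHERE public_id = ?", (pid,)
).fetchone()
```

Joins between tables use the compact sequential `id`; only the initial lookup by `public_id` touches the secondary index, so the randomness never reaches the hot join paths.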





© 2024 Hayatsk.info - Personal Blogs Platform. All Rights Reserved.