I have a Laravel API and a front-end built in Nuxt.js. On one of my front-end pages I display a paginated list of applications (customer data) from one of my database tables: a request is made to my endpoint, and the applications function fetches the data from the database as a paginated list.
The issue, though, is that it feels pretty slow considering that I'm not doing much with the data, and my database table doesn't have that many columns.
The schema for my applications table is:
Schema::create('applications', function (Blueprint $table) {
    $table->id();
    $table->string('hash', 42)->index();
    $table->string('product_type')->nullable();
    $table->string('email')->nullable()->index();
    $table->date('birthday')->nullable()->index();
    $table->string('postcode')->nullable()->index();
    $table->string('mobile')->nullable()->index();
    $table->text('data');
    $table->timestamps();
    $table->timestamp('verified_at')->nullable();
});
And the functions in question are:
/**
 * Applications
 */
public function getApplications()
{
    $applications = Application::orderBy('created_at', 'desc')
        ->paginate(25);

    foreach ($applications as $key => $applicant) {
        try {
            $applications[$key]['data'] = json_decode($applicant['data']);
        } catch (\Exception $e) { }
    }

    return $applications;
}
/**
 * All applications
 *
 * @param Request $request
 */
public function applications(Request $request)
{
    try {
        // daily graphs
        $applications = $this->getApplications();

        // not set or nothing to report
        if (!$applications) {
            return response()->json([
                'success' => false,
                'msg' => 'No applications found, check back in a few minutes',
            ], 422);
        }

        // return the response
        return response()->json([
            'success' => true,
            'msg' => 'Icicle applications',
            'applications' => $applications ?? null
        ], 200);
    } catch (\Exception $e) {
        // return default template
        return response()->json([
            'success' => false,
            'msg' => 'We were unable to load applications right now',
        ], 422);
    }
}
I've tried changing the paginate() to a simple get(), I've tried running the same query via Tinker, and I've also tried completely removing my foreach loop to see if that boosts performance.
It's taking 7 seconds to fetch the data from a table of 2 million records. I feel this should be much faster given how simple the query is. Am I missing something in my query or in the way I'm fetching the data?
My Answer:
There are a few things you can try to improve the performance of your query:
1. Indexing: Make sure the columns your query actually uses are indexed. This query sorts by 'created_at', which has no index in the schema above, so the database has to sort a large chunk of 2 million rows before it can return a single page; an index on 'created_at' lets it read rows already in order. ('email', 'birthday', 'postcode' and 'mobile' are already indexed, but this query never touches them.) See the sketch after this list.
2. Keep the 'data' blob out of list queries: 'data' is a plain text column holding JSON, not a relationship, so eager loading with the `with()` method doesn't apply here. If the list view doesn't render it, don't fetch or decode it at all; load it only when a single application is opened.
3. Select only necessary columns: Instead of selecting every column, select just the ones the list view needs. This keeps the large 'data' payload out of every page of results and reduces the amount of data retrieved from the database (see the sketch after this list).
4. Use caching: Consider implementing caching to store the results of your query and retrieve them from the cache instead of making a database query every time.
5. Optimize your code: The foreach that calls json_decode() on every row can be replaced with an Eloquent array cast on the 'data' attribute. The try/catch around it is dead code anyway, since json_decode() returns null on invalid input rather than throwing, unless you pass JSON_THROW_ON_ERROR.
By implementing these suggestions, you should be able to improve the performance of your query and reduce the time it takes to fetch data from the database.
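Putting points 1, 3 and 5 together, here's a minimal sketch. It assumes your model is App\Models\Application; the selected column list is an illustration, so adjust it to whatever your list view actually renders:

// Migration: index the column the list query sorts by (point 1)
Schema::table('applications', function (Blueprint $table) {
    $table->index('created_at');
});

// App\Models\Application: let Eloquent decode the JSON (point 5);
// this replaces the foreach and the dead try/catch entirely
class Application extends Model
{
    protected $casts = [
        'data' => 'array',
    ];
}

// Controller: fetch only the columns the list view needs (points 2 and 3)
public function getApplications()
{
    return Application::select(['id', 'hash', 'product_type', 'email', 'created_at', 'verified_at'])
        ->orderBy('created_at', 'desc')
        ->paginate(25);
}

Two further notes. First, paginate() also runs a separate COUNT(*) over the whole table to build the page links; on 2 million rows that count alone can be the bottleneck, and simplePaginate(), which renders only next/previous links, skips it. Second, if you add caching (point 4), each page needs its own cache key; a sketch, with an illustrative key name and a 60-second TTL:

use Illuminate\Support\Facades\Cache;

$page = (int) $request->query('page', 1);

// cache each page independently so page 2 doesn't serve page 1's results
$applications = Cache::remember("applications.page.{$page}", 60, function () {
    return $this->getApplications();
});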