Addressing Performance Issues for a Leading London Estate Agency: A Case Study
In November 2024, a leading London estate agency reached out to Wirebox after encountering significant issues with their website. The problems included slow or unavailable property searches, frequent timeouts, and overall poor site performance. Interestingly, the organisation was unaware of these issues, as Cloudflare’s “Always Online” feature was serving cached pages and masking the timeouts.
Outdated Technology Stack: A Key Culprit
The agency’s website was built on Laravel 8, which was released on September 8th, 2020. Laravel 8’s bug fixes were only supported until July 26th, 2022, and its security fixes ended on January 24th, 2023. This meant the platform was running on an outdated, unsupported release that would no longer receive patches for newly discovered vulnerabilities. As part of our initial assessment, we recommended updating their stack to the latest supported version, or at the very least to a version still receiving official support.
We also introduced the idea of using Laravel Shift, a tool we recommend to all our clients. It’s an excellent way to ensure smooth, automated updates, and we use it in our managed services.
At the time of the assessment, Laravel’s latest release was v11, with v12 expected in February or March 2025.

Front-End Issues with WordPress
On the front-end, the website relied on the open-source platform WordPress to display public content. While WordPress itself wasn’t a major factor in the performance issues, it was important to address any front-end concerns as part of the overall solution.
Investigating the Site’s Speed Issues
Before gaining access to the servers, our team had a few theories on why the website was performing poorly. Upon receiving access to the Laravel application, we conducted a thorough analysis.
We exported the Laravel files and set up a local environment to replicate the issues. The findings were eye-opening:
- Limited Indexing on Property Table: The database had very few indexes on the properties table—only the master ID and a geospatial index. With over 6,000 properties in the database, many queries were inefficient and time-consuming.
- Hardcoded Directory Paths: Several hardcoded directory paths in the code were preventing our tests from running in the local environment. Although not a major issue to fix (only about 8 lines of code), this indicated a lack of consistency and poor code quality.
- Inefficient Database Queries: We found more than 100 database queries filtering on columns without indexes. This forced the database to scan every record (over 6,000 properties) on each request, which was incredibly inefficient and contributed heavily to the slow performance.
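One way to surface queries like these during local testing (not necessarily the exact tooling we used, but a common Laravel approach) is to register a query listener that logs any statement slower than a chosen threshold. A minimal sketch, assuming a standard service provider and an illustrative 100 ms threshold:

```php
<?php

namespace App\Providers;

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    public function boot(): void
    {
        // Only listen while debugging locally; the threshold is illustrative.
        if (app()->environment('local')) {
            DB::listen(function ($query) {
                // $query->time is the execution time in milliseconds.
                if ($query->time > 100) {
                    Log::warning('Slow query', [
                        'sql'      => $query->sql,
                        'bindings' => $query->bindings,
                        'time_ms'  => $query->time,
                    ]);
                }
            });
        }
    }
}
```

Filtering the resulting log for statements against the properties table quickly confirms which columns need indexes.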
Fixing the Database: Creating Indexes
Using SSH access, we ran migration scripts to add indexes to the database. This addressed some of the immediate query inefficiencies. However, we also discovered deeper issues within the code that exacerbated performance problems.
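The exact indexes depend on the columns the search actually filters on; as a rough sketch (the column names below are illustrative, not the client’s real schema), the migrations took this shape and were applied with `php artisan migrate`:

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::table('properties', function (Blueprint $table) {
            // Illustrative columns: index whatever the search filters on.
            $table->index('status');
            $table->index('price');
            $table->index(['bedrooms', 'property_type']);
        });
    }

    public function down(): void
    {
        Schema::table('properties', function (Blueprint $table) {
            $table->dropIndex(['status']);
            $table->dropIndex(['price']);
            $table->dropIndex(['bedrooms', 'property_type']);
        });
    }
};
```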
Code Quality Issues: A Deeper Look
Our analysis revealed several code quality issues that contributed to the website’s poor performance:
- Repeated Coding Mistakes: In one block of code, file paths were hardcoded incorrectly. Multiple developers had worked on the codebase, leading to inconsistencies and mistakes. For instance:

```php
$destination = storage_path('app/public/images') . DIRECTORY_SEPARATOR . $name . '.webp';
$src = str_replace('/www/mandp-api/var/html/storage/app/public/', '/storage/', $destination);
return Storage::url(str_replace('/www/mandp-api/var/html/storage/', '', $destination));
```
- These inconsistencies appeared across multiple files, making it clear that there was no code review or quality assurance process in place.
- Improper Use of Laravel Blade: Blade templating was not used correctly, with raw PHP opening and closing tags (e.g. `<?php //etc ?>`) appearing in templates, which is not the recommended practice.
- N+1 Query Issue: The code was suffering from the infamous “N+1 query problem.” This occurs when a single query is made to fetch all records, and then for each record, additional queries are made to retrieve related data. In this case, for every property returned in search results, the system was running an additional query to fetch its thumbnails. This led to excessive database calls and slower load times.
For example, if 50 properties were returned in the search results, the system would run 51 queries—one to retrieve the properties and 50 more to get their associated thumbnails.
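The usual Laravel remedy is eager loading: declare the relationship on the model and ask for the thumbnails in the same call that fetches the properties, collapsing the 51 queries to 2. The model and column names below are assumptions for illustration rather than the client’s actual code:

```php
use Illuminate\Database\Eloquent\Model;
use Illuminate\Database\Eloquent\Relations\HasMany;

class Thumbnail extends Model {}

class Property extends Model
{
    // Assumed relationship: thumbnails.property_id references properties.id.
    public function thumbnails(): HasMany
    {
        return $this->hasMany(Thumbnail::class);
    }
}

// N+1 pattern: one query for the properties, then one more per property.
$properties = Property::where('status', 'available')->get();
foreach ($properties as $property) {
    $property->thumbnails; // runs a separate query on every iteration
}

// Eager loaded: two queries in total, however many properties match.
$properties = Property::with('thumbnails')
    ->where('status', 'available')
    ->get();
```

Recent Laravel releases can also flag this class of bug automatically in development via Model::preventLazyLoading(), which turns any lazy relationship access into an exception.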
Proof of Concept: Unit Testing for Speed Improvements
To better understand the impact of these issues, we set up unit tests to simulate how property searches were being handled on the website. Initially, the speed test showed that the search query was taking 8.39 seconds to execute—far too long for a smooth user experience.
We started by adding indexes to the database, which resulted in a slight speed improvement. But the main problem wasn’t just the database queries; we needed to dig deeper.
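In outline, the test simply timed a representative search and failed when it exceeded a budget. A simplified version of the idea, with an illustrative endpoint and threshold rather than the client’s real route:

```php
<?php

namespace Tests\Feature;

use Tests\TestCase;

class PropertySearchPerformanceTest extends TestCase
{
    public function test_property_search_completes_within_budget(): void
    {
        $start = microtime(true);

        // Illustrative endpoint; the real application exposed its own search route.
        $response = $this->get('/api/properties/search?location=london&max_price=500000');

        $elapsed = microtime(true) - $start;

        $response->assertOk();

        // 1 second is an arbitrary budget for illustration; the unoptimised
        // query was measured at 8.39 seconds against the real data set.
        $this->assertLessThan(1.0, $elapsed, "Search took {$elapsed}s");
    }
}
```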
Further Investigations and Fixes
We continued debugging and discovered additional issues:
- Thumbnail Query Performance: The thumbnails table contained 164,000 rows, and there was no index on the property_id column. Every time a property’s thumbnails were queried, the database had to scan through all of those rows, adding unnecessary delays (a migration sketch addressing this follows the list).
- Hardcoded File Paths in MongoDB: We also found that thumbnail URLs were being stored in the MongoDB database with hardcoded file paths, such as:
```
https://api.client_name/retracted/var/www/storage/app/public/thumbnails/CSV154821_13.webp
```
- These paths needed to be normalised to something simpler, like /storage/thumbnails/CSV154821_13.webp, to improve consistency and performance.
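Indexing the foreign key was a one-line migration; a sketch of what it looks like, assuming the relational thumbnails table described above:

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::table('thumbnails', function (Blueprint $table) {
            // Lets thumbnail lookups use an index instead of scanning ~164,000 rows.
            $table->index('property_id');
        });
    }

    public function down(): void
    {
        Schema::table('thumbnails', function (Blueprint $table) {
            $table->dropIndex(['property_id']);
        });
    }
};
```

With the index in place and the stored URLs rewritten to the short /storage/... form, thumbnail lookups no longer depended on either a full table scan or the server’s directory layout.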
After implementing these fixes in the local environment, we re-ran the performance tests.

Results: A 10x Speed Improvement
Once we addressed the database indexing issues and eliminated the redundant queries, the performance improved significantly. The search query, which previously took 8.39 seconds, now ran in just 0.77 seconds—over a 10x improvement!
While this was a great start, we believe there’s room for further optimisation by utilising proper Laravel relationships, implementing eager loading, and addressing the remaining code quality issues.
Prior to analysis: search execution time of 8.39 seconds.
Post optimisation: search execution time of 0.77 seconds.
Our work with this estate agency highlights the importance of regular updates, thorough code reviews, and efficient database design in maintaining a fast, scalable website. By addressing key issues such as outdated software, inefficient database queries, hardcoded paths, and code quality problems, we were able to drastically improve site performance and provide a better user experience.
If you’re facing similar performance issues on your site, don’t hesitate to get in touch with us. We can help optimise your code, streamline your database, and ensure your website is running at its best.