
From love to hate is one step, or how I stopped loving the magic in ActiveRecord

Recently, an interesting problem came up on one of my projects: a REST API endpoint took a surprisingly long time to respond, even though it returned very little data. What happened and why, I'll explain under the cut.

The admin panel displayed data from the database: 20 records per page plus pagination links. It took 50 (!!!) seconds. It would have been a sin not to check what was going on with the database. I couldn't believe that with 50k records, about 6-7 joins for filtering, and then 6-7 eager-loading queries, things could be this slow.

And so it turned out: all the queries together took about 0.18 s, which is quite acceptable.

Okay, time to dig further. And my indignation knew no bounds when I found out that all the time was being spent serializing the models. How does that happen?
```php
class OrderController
{
    public function index(Request $request, OrderFilter $filter)
    {
        // Apply the requested filters to the query
        $query = $filter->applyFilters($request);

        // Paginate the filtered result, 20 records per page by default
        return $query->paginate($request->input('count', 20));
    }
}
```

The dispatcher starts converting the controller's result depending on what the client wants. Naturally, it saw the Accept: application/json header and set about its dirty work. And that's where the fun began.

For each model, and for each of its relations, and so on recursively, a pile of methods gets called: magic getters, mutators, casts.
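To see how this adds up, here is a minimal sketch (a hypothetical `ToyModel`, not Laravel's actual classes) of the per-attribute machinery: every attribute read goes through `__get`, a `method_exists` lookup, and possibly an accessor method, for every attribute of every model in the result set.

```php
<?php

class ToyModel
{
    public function __construct(private array $attributes) {}

    // Every property access is routed through __get, the way Eloquent
    // routes it through its attribute machinery.
    public function __get(string $key)
    {
        // Look for an accessor such as getNameAttribute()
        $accessor = 'get' . str_replace('_', '', ucwords($key, '_')) . 'Attribute';

        if (method_exists($this, $accessor)) {
            return $this->$accessor($this->attributes[$key] ?? null);
        }

        return $this->attributes[$key] ?? null;
    }

    // A hypothetical accessor, invoked once per model on serialization
    public function getNameAttribute(?string $value): string
    {
        return strtoupper((string) $value);
    }

    public function toArray(): array
    {
        $out = [];
        foreach (array_keys($this->attributes) as $key) {
            $out[$key] = $this->$key;   // each key goes through __get again
        }
        return $out;
    }
}

$models = [];
for ($i = 0; $i < 3; $i++) {
    $models[] = new ToyModel(['id' => $i, 'name' => "user$i"]);
}

// Serializing N models with M attributes each means N*M __get calls plus
// method_exists lookups -- this is the cost the profiler pointed at.
$serialized = array_map(fn ($m) => $m->toArray(), $models);
```

With 20 models and a handful of attributes this is invisible; with deep relation trees serialized recursively, the call count explodes.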

That very sinister code:
```php
/**
 * Convert the model's attributes to an array.
 *
 * @return array
 */
public function attributesToArray()
{
    $attributes = $this->getArrayableAttributes();

    // If an attribute is a date, we will cast it to a string after converting it
    // to a DateTime / Carbon instance. This is so we will get some consistent
    // formatting while accessing attributes vs. arraying / JSONing a model.
    foreach ($this->getDates() as $key) {
        if (! isset($attributes[$key])) {
            continue;
        }

        $attributes[$key] = $this->serializeDate(
            $this->asDateTime($attributes[$key])
        );
    }

    $mutatedAttributes = $this->getMutatedAttributes();

    // We want to spin through all the mutated attributes for this model and call
    // the mutator for the attribute. We cache off every mutated attributes so
    // we don't have to constantly check on attributes that actually change.
    foreach ($mutatedAttributes as $key) {
        if (! array_key_exists($key, $attributes)) {
            continue;
        }

        $attributes[$key] = $this->mutateAttributeForArray(
            $key, $attributes[$key]
        );
    }

    // Next we will handle any casts that have been setup for this model and cast
    // the values to their appropriate type. If the attribute has a mutator we
    // will not perform the cast on those attributes to avoid any confusion.
    foreach ($this->getCasts() as $key => $value) {
        if (! array_key_exists($key, $attributes) ||
            in_array($key, $mutatedAttributes)) {
            continue;
        }

        $attributes[$key] = $this->castAttribute(
            $key, $attributes[$key]
        );

        if ($attributes[$key] &&
            ($value === 'date' || $value === 'datetime')) {
            $attributes[$key] = $this->serializeDate($attributes[$key]);
        }
    }

    // Here we will grab all of the appended, calculated attributes to this model
    // as these attributes are not really in the attributes array, but are run
    // when we need to array or JSON the model for convenience to the coder.
    foreach ($this->getArrayableAppends() as $key) {
        $attributes[$key] = $this->mutateAttributeForArray($key, null);
    }

    return $attributes;
}
```


Mutators are, of course, a very handy thing. Accessing all kinds of derived data on models and relations is neat and elegant, but on the documentation page the developers couldn't be bothered to mention that using them has a significant (I'd even say huge) impact on performance.
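One way to limit the damage, sketched below with hypothetical names, is to apply the formatting only where the client actually needs it and serve raw rows everywhere else (in Laravel terms, roughly what you get by fetching plain rows instead of full models):

```php
<?php

/** Accessors keyed by attribute name -- each one is an extra call per row. */
$accessors = [
    'total' => fn ($v) => number_format($v / 100, 2),  // cents -> "12.34"
];

function serializeRow(array $row, array $accessors): array
{
    foreach ($accessors as $key => $fn) {
        if (array_key_exists($key, $row)) {
            $row[$key] = $fn($row[$key]);
        }
    }
    return $row;
}

$rows = [
    ['id' => 1, 'total' => 1234],
    ['id' => 2, 'total' => 500],
];

// Formatted output only for the endpoints that need it...
$pretty = array_map(fn ($r) => serializeRow($r, $accessors), $rows);

// ...and raw rows for bulk endpoints, skipping the per-attribute calls.
$raw = json_encode($rows);
```

The point is not this particular helper but the split itself: the expensive per-attribute work becomes opt-in rather than something the serializer runs on every model unconditionally.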

And then comes the realization that the train has already picked up far too much speed, and there's no time to rewrite everything on a data mapper / query builder. I was left stranded with ActiveRecord. I like this magic, but you can't abuse it.

To avoid breaking anything, I called Redis to the rescue: it now holds all the data and is refreshed whenever the models are updated. But no such luck! The volume of data turned out to be so large that Redis fell over (my fault, perhaps I should have tuned it). I had to run the data through gzcompress, because the standard 64 MB was not enough. I even spun up a separate instance to make sure Redis wouldn't fall over under load again.
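The scheme can be sketched like this, assuming ext-zlib; the `$cache` array stands in for Redis here so the snippet is self-contained (with phpredis you would call `$redis->set()` / `$redis->get()` on the compressed string instead):

```php
<?php

$cache = [];

function cachePut(array &$cache, string $key, array $data): void
{
    // Compress the JSON payload so large datasets shrink several-fold --
    // this is the gzcompress step from the text.
    $cache[$key] = gzcompress(json_encode($data), 6);
}

function cacheGet(array &$cache, string $key): ?array
{
    if (! isset($cache[$key])) {
        return null;   // cache miss: rebuild from the database
    }
    return json_decode(gzuncompress($cache[$key]), true);
}

// The cache is refreshed whenever models are updated (e.g. from a model
// observer); the API then reads the pre-serialized data back.
$orders = array_map(
    fn ($i) => ['id' => $i, 'status' => 'paid'],
    range(1, 1000)
);
cachePut($cache, 'orders:page:1', $orders);

$fromCache = cacheGet($cache, 'orders:page:1');
$ratio = strlen($cache['orders:page:1']) / strlen(json_encode($orders));
```

Since the data is stored already serialized, the request path no longer touches the models or their mutators at all.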

Now everything works, everything is fine. The data is served in under 0.5 s, and everyone is happy. But I sit here thinking: "Laravel, you win people over with speed and simplicity, but the next project will definitely be without your ActiveRecord."

And that's the end of the story; whoever listened, well done: may you avoid such bottlenecks.

Source: https://habr.com/ru/post/281493/

