Hi, I need to optimize some queries (PostgreSQL). One of them looks like the following:
The query currently takes around 320 ms. The result of EXPLAIN ANALYZE is:
The numbers, i.e. the IDs in the IN (...) part, are arbitrary; they are generated at run time by the application. I tried to optimize the query by using a temporary table (called tmp_ItemIds). I saved the IDs in the temporary table and changed the original query to:
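The temp-table approach described above can be sketched like this. Since the original query and table definitions were posted as images, everything beyond the table names item_features, colors, and tmp_ItemIds is an assumption; SQLite stands in for PostgreSQL here purely to make the sketch self-contained.

```python
import sqlite3

# Minimal stand-ins for the tables from the question (column names
# are assumptions; the real schema was posted as an image).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE colors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE item_features (item_id INTEGER, color_id INTEGER);
""")
conn.executemany("INSERT INTO colors VALUES (?, ?)",
                 [(1, "Farbe rot"), (2, "Farbe blau"), (3, "gruen")])
conn.executemany("INSERT INTO item_features VALUES (?, ?)",
                 [(10, 1), (11, 2), (12, 3)])

# Instead of interpolating a long IN (...) list, the run-time IDs go
# into a temporary table, and the main query joins against it.
item_ids = [10, 11]
conn.execute("CREATE TEMP TABLE tmp_ItemIds (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO tmp_ItemIds VALUES (?)",
                 [(i,) for i in item_ids])

rows = conn.execute("""
    SELECT f.item_id, c.name
      FROM item_features f
      JOIN colors c      ON c.id = f.color_id
      JOIN tmp_ItemIds t ON t.id = f.item_id
     WHERE c.name LIKE '%arbe%'
     ORDER BY f.item_id
""").fetchall()
print(rows)
```

Item 12 drops out because its ID is not in tmp_ItemIds, and 'gruen' would not match the LIKE pattern anyway.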
The query takes around 50 ms now. I'm not sure whether this is the best solution. Can somebody tell me how to optimize it further, and whether using a temporary table is an appropriate solution? The tables item_features and colors have the following structure. As you can see, two indexes are defined so far: If I understand one of your articles correctly, it would theoretically help to change LIKE '%arbe%' to LIKE 'arbe%'? Thanks, Chris asked May 21 '14 at 20:59 kid34 Markus Winand ♦♦
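For context on why the anchoring matters: a plain B-tree index can serve a left-anchored LIKE as a range scan, but a pattern starting with a wildcard cannot use it. A hedged sketch, assuming the filtered column is colors.name (the real schema was posted as an image):

```sql
-- text_pattern_ops makes the index usable for LIKE range scans even
-- in databases with a non-C collation.
CREATE INDEX colors_name ON colors (name text_pattern_ops);

SELECT * FROM colors WHERE name LIKE 'arbe%';  -- index range scan possible
SELECT * FROM colors WHERE name LIKE '%arbe%'; -- leading wildcard: B-tree unusable
```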
Well, it would help performance, but it would also change the result. That statement was meant as application-design advice, not tuning advice ;) Looking at the execution plan you posted, it is clearly visible where the time is lost:
So there is a full table scan on the table filtered by the LIKE predicate. There are two ways to approach this:
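One common remedy when the full scan is caused by a non-anchored pattern like '%arbe%': PostgreSQL's pg_trgm extension can index such predicates with a trigram GIN index. A sketch, assuming the filtered column is colors.name (not confirmed by the original post):

```sql
CREATE EXTENSION IF NOT EXISTS pg_trgm;

-- A trigram GIN index supports LIKE/ILIKE patterns without a left
-- anchor, so LIKE '%arbe%' can become a bitmap index scan instead
-- of a full table scan.
CREATE INDEX colors_name_trgm ON colors USING gin (name gin_trgm_ops);
```

Whether this actually beats the full scan depends on the table size and the selectivity of the pattern, so check EXPLAIN ANALYZE again afterwards.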
So, there are two different approaches; each might further benefit from an "on-top" optimization (index-only scan; second index). Your mileage will vary based on your data. Try these ideas and let us know how it worked out. PS: I think there are better options than temp tables, but let us know if the above suggestions don't work for you. answered May 27 '14 at 12:17 Markus Winand ♦♦