This article answers the following query:
How to identify expensive SQL statements using the ST04 transaction?
As shown below, go to transaction ST04 and navigate to SQL statement analysis -> Shared cursor cache.
Double-click on Shared cursor cache, which leads you to the following screen.
In the above screen, enter 50,000 (an example threshold) in the Buffer gets field, select the Buffer gets radio button under List sort options, and execute.
This displays all the SQL statements whose buffer gets exceed 50,000, sorted in descending order of buffer gets, as shown below.
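For reference, on an Oracle-based SAP system the shared cursor cache shown by ST04 is built from the database's shared SQL area. A minimal sketch of a roughly equivalent check, run directly against Oracle's V$SQL view (the 50,000 threshold is just the example value used above):

```sql
-- Sketch: list cached statements with more than 50,000 buffer gets,
-- in descending order of buffer gets (roughly what the ST04
-- shared cursor cache screen displays on Oracle).
SELECT sql_id,
       buffer_gets,
       executions,
       disk_reads,
       SUBSTR(sql_text, 1, 100) AS sql_text_snippet
  FROM v$sql
 WHERE buffer_gets > 50000
 ORDER BY buffer_gets DESC;
```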
As highlighted in the above screen, identify the top 5 or 10 statements by buffer gets. These are the expensive (costly) SQL statements.
Also look at the Executions column. A high execution count means the SQL statement runs frequently, which makes tuning it all the more worthwhile.
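One way to combine the two indicators is to look at buffer gets per execution; a sketch, again assuming Oracle's V$SQL view:

```sql
-- Sketch: a statement with both high total buffer gets and a high
-- execution count is a strong tuning candidate; buffer gets per
-- execution shows how expensive a single run is.
SELECT sql_id,
       buffer_gets,
       executions,
       ROUND(buffer_gets / NULLIF(executions, 0)) AS gets_per_execution
  FROM v$sql
 WHERE buffer_gets > 50000
 ORDER BY executions DESC;
```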
In the above screen, click on the Explain icon (highlighted), which opens another window as shown below.
In the above window, you can view:
- Estimated costs to execute the SQL
- Estimated rows to be fetched
- Estimated CPU costs
- Estimated IO costs
Ideally, all of these values should be low for an optimized SQL statement. For expensive SQL statements, these values will be very high, impacting system performance.
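The estimates shown by the Explain function can also be produced directly at database level with Oracle's EXPLAIN PLAN; a minimal sketch, where the SELECT on VBAK is only a placeholder statement:

```sql
-- Sketch: generate the execution plan for a (placeholder) statement.
EXPLAIN PLAN FOR
  SELECT * FROM vbak WHERE erdat = '20240101';

-- Formatted plan with estimated cost and rows per step:
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- The underlying PLAN_TABLE also carries the separate estimates:
SELECT operation, options, object_name,
       cost, cardinality, cpu_cost, io_cost
  FROM plan_table;
```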
Also, check the above window for a FULL TABLE SCAN. If one is present, it usually means the indexing is not proper for the table involved in this SQL statement.
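To hunt for full scans across the whole cursor cache rather than one statement at a time, the cached plans can be searched directly; a sketch assuming Oracle's V$SQL_PLAN view:

```sql
-- Sketch: find cached statements whose execution plans
-- contain a full table scan.
SELECT DISTINCT sql_id,
       object_owner,
       object_name
  FROM v$sql_plan
 WHERE operation = 'TABLE ACCESS'
   AND options = 'FULL';
```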
Double-click on the table or index name that is going for a full scan to view a window similar to the one below:
In this screen, note the "Last statistics date". If this date is too old, it is recommended to run update statistics for this table or index. This improves the performance of the SQL statement because it gives the optimizer correct, recent statistics, so the execution plan for the statement is built properly.
With outdated statistics, the execution plan will not be optimal, leading to high row fetches and increased CPU and IO costs.
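The same "Last statistics date" can be checked in the Oracle dictionary, and the statistics refreshed from there; a sketch assuming SAPSR3 as the schema owner and VBAK as a placeholder table. (On SAP systems the supported route for updating statistics is normally BRCONNECT, e.g. scheduled via DB13, rather than calling DBMS_STATS by hand.)

```sql
-- Sketch: check when optimizer statistics were last gathered.
SELECT table_name, num_rows, last_analyzed
  FROM dba_tables
 WHERE owner = 'SAPSR3'         -- assumed SAP schema owner
   AND table_name = 'VBAK';     -- placeholder table name

-- Refresh the statistics if the date is too old:
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SAPSR3', tabname => 'VBAK');
```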