Thursday, February 25, 2010

The Journey from Smart to Good

“It’s easier to be smart than good.”

I heard that quote in a recent sermon at church, and it is one I can very much relate to in more than one area of my life.  In context, the pastor was discussing the difference between knowing what the Bible says (being smart) and applying what the Bible says (being good), and all too often I am more smart than good in that area; that’s a topic for another blog.  But really, the quote can be applied to many areas, including being a SQL Server DBA, and that’s what I want to talk about here.

All too often I’m smart rather than good when it comes to being a DBA.  I allude to this in my 2010 Goals post, in goal #2 of the Professional Development section.  I’m great at learning new things and always have been, but I’m not as good at putting them into practice.  I KNOW a lot about SQL Server, but, for a variety of reasons (fear of failing, the size of the SQL Server implementations, etc.), I haven’t APPLIED that knowledge.  One other thing that comes into play, for me anyway, is the three levels of knowledge discussed in this blog post (note: while the ideas are good, I don’t agree with some of the wording).  Basically, it says there are three areas of knowledge:

  1. The things you know you know.
  2. The things you know you don’t know.
  3. The things you don’t know you don’t know.

The goal is to have area 3 be as small as possible and area 1 as large as possible.  My issue in regard to being a SQL Server DBA is that I have moved a big chunk of area 3 into area 2.  That just makes me feel inadequate, but in reality it means I’m making progress.  The next step is to move some of area 2 into area 1.

So what’s the plan for going from smart to good?  I think the first step is to understand that there is a difference, then begin applying what you already know.  Then just continue to learn AND apply. 

Remember, it’s a journey, not a destination, and the good DBAs I know realize that you never really arrive.

So where are you in the journey?

Monday, February 22, 2010

What’s up with the reads?

So the other day I read Andrew Kelly’s article, Do You Have Hidden Cache Problems, in the latest electronic edition of SQL Server Magazine, and decided to run the query he listed in the article to see if my server was suffering from this problem:

SELECT
    SUM(single_pages_kb + multi_pages_kb) AS "CurrentSizeOfTokenCache(kb)"
FROM
    sys.dm_os_memory_clerks
WHERE
    name = 'TokenAndPermUserStore'


My result was 189,344 KB.  In the article Andrew states that:




…if it’s under 10MB, you should be fine.  I’d start getting concerned between 10MB and 50MB.  If the amount of the consumed memory is over 50MB, you’re probably affected by this problem.




Well, my value converts to nearly 185MB!!!  I guess I must have this problem.  The article says this about the cause:
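For convenience, a small tweak to Andrew's query (my variation, not from the article) returns the size in MB directly:

```sql
-- Same memory clerk, with the result converted from KB to MB
SELECT
    SUM(single_pages_kb + multi_pages_kb) / 1024 AS "CurrentSizeOfTokenCache(MB)"
FROM
    sys.dm_os_memory_clerks
WHERE
    name = 'TokenAndPermUserStore';
```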




On a system that reuses query plans effectively, this cache would be as large as only a few megabytes,… But on a 64-bit system with lots of ad hoc or dynamic SQL queries, this cache can grow to hundreds of megabytes in size.




So apparently my lightly loaded server (one main application on it right now, with a max of about 50 user connections) has a lot of ad hoc or dynamic SQL hitting it.  Well, since the application is a third-party application, I decided to run this query to see what I had in the procedure cache for ad hoc queries:



SELECT
    DEQS.execution_count,
    DEQS.total_worker_time,
    DEQS.total_physical_reads,
    DEQS.total_logical_reads,
    DEQS.total_elapsed_time,
    DECP.refcounts,
    DECP.usecounts,
    DECP.cacheobjtype,
    DECP.objtype,
    DEST.text
FROM
    sys.dm_exec_query_stats AS DEQS JOIN
    sys.dm_exec_cached_plans AS DECP
        ON DEQS.plan_handle = DECP.plan_handle CROSS APPLY
    sys.dm_exec_sql_text(DEQS.sql_handle) AS DEST
WHERE
    DECP.objtype = 'ADHOC'
ORDER BY
    DEQS.execution_count DESC


Hmmm, 4800+ rows returned.  As I looked a little deeper, I noticed that rows 10-12, based on execution count, were the same query with different WHERE clauses.  The queries looked like this:



SELECT TOP 1
    Column1
FROM
    Table WITH (NOLOCK)
WHERE
    (Column2 = 'Column Default Value' OR
     Column2 = 'Another Column Value') AND
    (Column3 = 'Column Default Value') AND
    Column4 = 'Value' AND
    Column5 = 'Value'
ORDER BY
    Column2,
    Column3


Interesting.  Then I looked a little further down the list and found some more of the same query.  So I reran my query adding WHERE DEST.text LIKE '%KeyWord1%KeyWord2%KeyWord3%' and found over 600 rows for this “single” query in the cache.  Now, those of you who read my blog know that I am a believer in stored procedures for data access, and if you aren’t using stored procedures then use parameterized SQL.  This is a great example of a query that needs to be encapsulated in a stored procedure or parameterized.
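Since the actual query is masked, here is only a sketch of what a parameterized version might look like using sp_executesql (the table, column, and parameter names are placeholders, and I’m guessing at the data types); the point is that every variation of the query would then reuse one cached plan:

```sql
-- Sketch: parameterizing the vendor's ad hoc query with sp_executesql.
-- Names and data types are placeholders, not the vendor's real schema.
DECLARE @sql NVARCHAR(MAX);
SET @sql = N'
SELECT TOP 1 Column1
FROM dbo.SomeTable WITH (NOLOCK)
WHERE (Column2 = @P1 OR Column2 = @P2) AND
      Column3 = @P3 AND
      Column4 = @P4 AND
      Column5 = @P5
ORDER BY Column2, Column3;';

EXEC sys.sp_executesql
    @sql,
    N'@P1 VARCHAR(50), @P2 VARCHAR(50), @P3 VARCHAR(50), @P4 VARCHAR(50), @P5 VARCHAR(50)',
    @P1 = 'Column Default Value',
    @P2 = 'Another Column Value',
    @P3 = 'Column Default Value',
    @P4 = 'Value',
    @P5 = 'Value';
```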



It gets even better.  I took one of the queries and ran it to see what it returned.  It returned nothing!  So I ran SELECT * FROM Table to see what was in the table: nothing, at least at that time!  The fun part is that the application query does a Clustered Index Seek but takes 2 scans to do it according to STATISTICS IO, while the SELECT * does a Clustered Index Scan but only scans the index once according to STATISTICS IO.  So I looked at the clustered index on the table, and it is (based on my query):



    Column2, Column3, Column4, Column5, Column6



So the only column in the table not included in the clustered index is Column1 (the value being returned).  As I looked at the query, it hit me: there is an OR in the WHERE clause, so in the query plan you have 2 seek predicates and thus 2 scans of the index.  If you remove the OR, you get 1 seek predicate and 1 scan.
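You can see the scan counts for yourself with STATISTICS IO (table and column names here are the same placeholders I used above, not the real schema):

```sql
SET STATISTICS IO ON;

-- With the OR: two seek predicates in the plan, so STATISTICS IO
-- reports Scan count 2 for the clustered index.
SELECT TOP 1 Column1
FROM dbo.SomeTable
WHERE (Column2 = 'Column Default Value' OR
       Column2 = 'Another Column Value') AND
      Column3 = 'Column Default Value'
ORDER BY Column2, Column3;

-- Without the OR: one seek predicate, Scan count 1.
SELECT TOP 1 Column1
FROM dbo.SomeTable
WHERE Column2 = 'Column Default Value' AND
      Column3 = 'Column Default Value'
ORDER BY Column2, Column3;

SET STATISTICS IO OFF;
```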



So what does all this mean?  Well, for me it solidifies 2 things:

  1. Use stored procedures or, at the least, parameterized SQL.
  2. ORs are bad and should be avoided as much as possible.


So what am I going to do?  I honestly don’t know for sure.  Because there are no complaints about performance, I don’t feel the need to address the TokenAndPermUserStore issue, and I’m not sure what I’d do anyway besides possibly using Forced Parameterization.  I think I will contact the vendor (they’ve been good to work with so far) to find out what the query really does and why it isn’t parameterized or in a stored procedure, since most of their other stuff is.  I’m open to suggestions.
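For reference, forced parameterization is set at the database level; a minimal sketch (the database name is a placeholder, and since this affects every query in the database, test it before trying it in production):

```sql
-- Sketch only: the database name is a placeholder.
ALTER DATABASE [VendorDB] SET PARAMETERIZATION FORCED;

-- And to go back to the default behavior:
ALTER DATABASE [VendorDB] SET PARAMETERIZATION SIMPLE;
```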

Thursday, February 18, 2010

SQLSaturday Acquired by PASS – Reactions

The Announcement

On February 2nd, PASS (@sqlpass), here, and Andy Warren (@sqlandy), here, announced that ownership of the SQLSaturday franchise brand had been transferred to PASS.  Since I am friends with Andy, I know this is something that he and his partners (Steve Jones (@way0utwest) and Brian Knight (@BrianKnight)) have wanted to do for a while.

The Reactions

After the announcement there were several posts in reaction to the acquisition (if I missed any, please add them in the comments):

  1. Andy Warren Explains the move - SQLSaturday & PASS – More Details
  2. Andy Leonard (@AndyLeonard) interviews Andy Warren including the reason for the move - Interview with Andy Warren about SQL Saturday, PASS, and More
  3. Steve Jones gives his take - SQLSaturday Bequeathed to PASS
  4. Marlon Rubinal (@marlonrubinal) – is very positive about the change - SQL Saturday Acquired By PASS
  5. Tim Ford (@sqlagentman) takes a more skeptical view of the move - A SQL Saturday Sequel: Van Halen or Van Hagar?
  6. Tim Mitchell (@tim_mitchell) is positive, but also had some concerns - The PASS Acquisition of SQL Saturday

The Reason

Having been a co-organizer of SQLSaturday #21 – Orlando, I have some understanding of what it takes to put on an event (Andy really did most of the work, so I don’t understand all of it).  Plus, being friends with Andy and seeing him regularly, I also have some insight into the amount of work he put into helping events and maintaining/improving the tools (web site).  Based on this, I understand why Andy, Steve, and Brian would want to pass the administration of SQLSaturday to PASS, an organization with event experience and a full-time staff.  Remember, SQLSaturday was not a profit center for Andy, Brian, and Steve; any time Andy (or either of the others) put into SQLSaturday was time not spent trying to make money.

My Take

You can see in my post, What Should PASS Strive To Be?, under point 3, that I believe PASS should be supporting/administering regional events like SQLSaturday.  Based on that, I believe this acquisition is a great thing from the perspective of PASS.  Having said that, I understand and, really, agree with the concerns Tim Ford and Tim Mitchell raise.  I think that if PASS remains true to the founding principles of SQLSaturday, which I understand to be:

  1. Provide a day of quality, free, SQL Server-focused training that is organized and administered by local people.  Key word: free.  Yes, some events have charged for lunch, but getting that lunch is optional.  This is facilitated by providing:
    1. An event framework with a minimum of rules
    2. An administrative application (SQLSaturday web site) for registration; accepting, approving, and scheduling sessions; messaging for sponsors, speakers, and registrants; and processing funds collected from sponsors and lunch fees
  2. Provide another outlet for local speakers.
  3. Provide a growth path for local speakers.  Chapter session –> SQLSaturday –> National/International Conference

and builds upon them, then this will be great for both parties.  If not, I think it is likely that previous local SQLSaturday organizers will start organizing non-PASS-branded events to continue the spirit of these principles.  PASS needs to understand that many people in the SQL Server community are committed to these principles.

What Can/Should PASS Do to Make SQLSaturday Better?

To reference my What Should PASS Strive To Be? post again (point 4), I think one area where PASS can help is developing regional leadership teams that assist the local chapter leaders in organizing a SQLSaturday.  For example, in the southeastern United States you have Andy Warren, Pam Shaw, Stuart Ainsworth, and Karla (Remail) Landrum, who have all done (or by the summer will have done) multiple events; use their experience to help others.

Provide assistance in contacting sponsors.  Chapters tend to lean on local companies and recruiters, who have been great supporters of the events thus far, but the sheer size of the event means organizers need more than that, and they may or may not have the necessary contact information.

Continue to enhance the administrative tools provided via the SQLSaturday web site.  One of the things I see as missing is volunteer management (I did mention it to Andy).  The session-scheduling application could be tweaked to schedule volunteers.  This has 2 benefits:

  1. The volunteers know when and what they will be doing ahead of time so they can schedule their day appropriately.
  2. The organizers will see where they need to fill in gaps and make better use of the volunteers’ time. 

Conclusion

I really think this change makes sense.  PASS had tried to do community events with the Community Connection branding but struggled to get it off the ground due to a lack of administrative tools and the fact that SQLSaturday was already out there and had the tools.  So now they have the tools and, hopefully, the will to grow the event.

Tuesday, February 16, 2010

SQL Developers Please Use the Table Name/Alias to Prefix Columns

Note:  After I completed this post, Aaron Bertrand added this subject to his “Bad Habits to Kick” series for Jay and me, as we suggested on Twitter.  I wrote this post because I knew Aaron was headed to the Winter Olympics, so I didn’t think he’d get to it for a few weeks, and I needed an idea.  Thanks for getting to it, Aaron.
This post is almost an extension of Aaron Bertrand’s (@AaronBertrand) excellent “Bad Habits to Kick” series (if you aren’t reading his blog you should) and was inspired by this tweet:
[screenshot: Jay’s tweet]
Basically Jay was looking at a query like this (hard for me to write in SSMS because of SQLPrompt):
SELECT
    Sales.SalesOrderHeader.SalesOrderID,
    OrderDate,
    ShipDate,
    Status,
    Sales.SalesOrderHeader.TerritoryID,
    SubTotal,
    TaxAmt,
    Freight,
    TotalDue,
    Comment,
    OrderQty,
    UnitPriceDiscount,
    LineTotal,
    SP.SalesPersonID,
    SP.TerritoryID,
    SalesQuota
FROM
    Sales.SalesOrderHeader JOIN
    Sales.SalesOrderDetail AS SOD
        ON Sales.SalesOrderHeader.SalesOrderID = SOD.SalesOrderID JOIN
    Sales.SalesPerson AS SP
        ON Sales.SalesOrderHeader.SalesPersonID = SP.SalesPersonID

Now, to the original author this code makes perfect sense; they know which column belongs to which table.  But if there are performance issues and you call someone in to help, that person won’t know which tables the columns belong to without looking at the schema.  As a matter of fact, the original author likely won’t remember which tables the columns belong to 6 months later either.  In my opinion the above query should look like this:




SELECT
    SOH.SalesOrderID,
    SOH.OrderDate,
    SOH.ShipDate,
    SOH.Status,
    SOH.TerritoryID,
    SOH.SubTotal,
    SOH.TaxAmt,
    SOH.Freight,
    SOH.TotalDue,
    SOH.Comment,
    SOD.OrderQty,
    SOD.UnitPriceDiscount,
    SOD.LineTotal,
    SP.SalesPersonID,
    SP.TerritoryID,
    SP.SalesQuota
FROM
    Sales.SalesOrderHeader AS SOH JOIN
    Sales.SalesOrderDetail AS SOD
        ON SOH.SalesOrderID = SOD.SalesOrderID JOIN
    Sales.SalesPerson AS SP
        ON SOH.SalesPersonID = SP.SalesPersonID

I know, without going anywhere else, what table each column belongs to, and it is consistent.  I even think the aliases meet Aaron’s standard.


I know I can be a bit OCD about formatting T-SQL, but I like to be able to read the code quickly and at a glance.  See this post for my standards (I’ve actually changed them slightly because SQLPrompt doesn’t, at least not that I’ve found, do EXACTLY what I like).


What do you think?

Wednesday, February 10, 2010

February OPASS Meeting Recap

Introduction

Last night we had our February OPASS meeting with Ron Dameron (@RonDBA) presenting on Database Hardening using PowerShell.  We broadcast the main presentation using Live Meeting and this time I remembered to hit record, so the session will be available on the OPASS web site at some point.

Attendance

We had a decent turnout of about 20, which is around our average, but this was lower than we expected because we had a lot of RSVPs come in.  There were a couple of reasons for the lower-than-expected turnout: weather and traffic.  It was pouring at the meeting’s start time, which probably caused some folks to decide to go home instead.  And if you had to come through Orlando to get there, the traffic was backed up everywhere because of accidents, likely due to the weather.  So we lost some attendance and had some late arrivals.  We also had 6 remote attendees, though we were a little disappointed that the folks who didn’t make it in person didn’t attend the Live Meeting.

Announcements

We started the meeting with announcements and the chapter deck from PASS.  There was a lot of information in the deck and, in my opinion, it could have been trimmed down a bit.  I hit the highlights and moved fairly quickly through them.  I added a couple of slides to the deck for local announcements.  On my laptop, a white font was hard to read against the right side of the background while a black font was readable, so I went with the black font.  When I got it up on the screen, the black font was hard to read on the left side.  I should have taken a bit more time with that, as I certainly didn’t want to have a second deck just for my couple of slides.

Networking

We then had a longer-than-anticipated informal networking session where we discussed the recent Tampa SQLSaturday, the transition of SQLSaturday to PASS ownership, and other PASS topics.  The delay was because we were waiting for the pizza to arrive; apparently a 10-pizza order is tough to fill on time (see Andy Warren’s recap for more).

Featured Presentation

The featured presentation was Ron describing and demonstrating how he uses PowerShell to automate common tasks and to fulfill requests for audit-type information (server settings, users, etc.).  He’s on a team charged with automating, standardizing, and optimizing DBA tasks for each of their database platforms, and as the SQL Server DBA on that team his responsibility is the ~600 SQL Servers (servers, not databases) in the company.  Without PowerShell, some of the tasks he has had to do would be nearly impossible.  It was a good session that showed the power of PowerShell in a large environment, and he also did a good job of sharing how he learned PowerShell.

The Raffle

We finished up by raffling off some SWAG, and it was great that each of our first-time attendees won something!  It’s always nice to reward new people that way!

Post-Meeting

After the meeting several people hung around for an hour or so to talk shop and to get to know each other better.  I think the networking aspect of our group is getting better and people are seeing the value of building relationships in the group.

Andy (@SqlAndy), Kendal Van Dyke (@SQLDBA), and I looked at an interesting issue Kendal had seen at work and not been able to solve.  We didn’t solve it either; hopefully, we’ll see a nice blog post from Kendal about it.  We also talked about PASS, about growing speakers and providing a path for growth, and about how PASS can help newer speakers get out to more events.  We ended up staying way too late, but it was a great discussion.

Personal Conclusion

I say this often, but, if you have access to a user group and aren’t going, you need to start.  Honestly, I felt horrible and wanted to go to bed when it was time to head to the meeting, but I felt great AFTER the meeting.  I’m almost always energized and refreshed after attending a user group meeting because I get excited again about what I get to do by hearing about what other people are doing.

Tuesday, February 9, 2010

February OPASS Meeting Tonight

It's here!  Our February meeting is tonight at 6pm at End To End Training, 225 S. Westmonte Drive, Suite 2010, Altamonte Springs, FL (see the OPASS web site for directions).  While we would love for you to be there in person, the presentation will also be available on-line via Live Meeting at https://www.livemeeting.com/cc/mvp/join?id=24PPGB&role=attend.

Speaker: Ronald Dameron

Ronald Dameron is a Senior Database Administrator for the largest life insurer in the United States. He is most comfortable with Microsoft SQL Server as a database developer and DBA. He is currently exploring how PowerShell can simplify his life as a DBA and the new features of SQL Server 2005/8. Follow him on Twitter at @RonDBA and at his blog at http://ronalddameron.blogspot.com.

He has 20 years of IT experience on a wide range of platforms, including mainframes (yes, he does know what JCL is) and Windows Server systems.
He now chants Standardization, Optimization, Automation in his sleep.

Topic: Database Hardening via PowerShell.

Ron will demonstrate how he uses PowerShell to handle the issues encountered in a Fortune 50 corporate environment that has over 500 database servers and 3600 databases.

Please email webmaster@opass.org to RSVP for in person attendance.

Thursday, February 4, 2010

SSMS and Visual Studio Tip

Did you know that if you right-click on a tab in SSMS or Visual Studio you get the option to “Close All But This”?

[screenshot: the “Close All But This” option on the tab’s right-click menu]

Maybe everyone else already knew this, but at the January OPASS meeting Andy Leonard (@AndyLeonard) used this and it was the first time I had seen it.  I use it almost every day now.

Wednesday, February 3, 2010

DBCC CHECKDB() on ResourceDB?

A few days ago I was answering a forum post about CHECKDB() and as part of the research I checked the Windows Application Log.  I was surprised to see this entry:

Event Type:    Information
Event Source:    MSSQLSERVER
Event Category:    (2)
Event ID:    8957
Date:        1/30/2010
Time:        5:45:02 PM
User:        domain\username
Computer:    ServerName
Description:
DBCC CHECKDB (mssqlsystemresource) WITH all_errormsgs, no_infomsgs, data_purity executed by domain\username found 0 errors and repaired 0 errors. Elapsed time: 0 hours 0 minutes 0 seconds.

I’m using Ola Hallengren’s backup and maintenance scripts to do my integrity checks, but you can’t run DBCC CHECKDB() against ResourceDB, can you?  Well, just to verify I tried it and got this message:

Msg 2520, Level 16, State 11, Line 1
Could not find database 'mssqlsystemresource'. The database either does not exist, or was dropped before a statement tried to use it. Verify if the database exists by querying the sys.databases catalog view.

So where did the Application Log entry come from?  As usual, BOL has the answer, in the DBCC CHECKDB() entry:

Because the Resource database is modifiable only in single-user mode, the DBCC CHECKDB command cannot be run on it directly. However, when DBCC CHECKDB is executed against the master database, a second CHECKDB is also run internally on the Resource database. This means that DBCC CHECKDB can return extra results.

So while you can’t run the command directly, SQL Server takes care of it for you when you run DBCC CHECKDB() against the master database.
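In other words, the Application Log entry above came from something like this (using the same options shown in the log):

```sql
-- Checking master also runs a second CHECKDB internally against the
-- Resource database, which is what produces the mssqlsystemresource
-- entry in the Application Log.
DBCC CHECKDB (master) WITH ALL_ERRORMSGS, NO_INFOMSGS, DATA_PURITY;
```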

The question is, are you doing anything to back up the Resource database?
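For what it’s worth, BOL says SQL Server can’t back up the Resource database with BACKUP DATABASE; you take a file-based copy of its data and log files instead, treating them like binaries.  A sketch (the paths are assumptions for a default SQL Server 2005 instance, and xp_cmdshell is shown only for illustration; it must be enabled, and a scheduled OS-level copy works just as well):

```sql
-- Sketch only: file-based "backup" of the Resource database.
-- Paths are assumptions; adjust for your instance and version.
EXEC master.dbo.xp_cmdshell
    'copy "C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\mssqlsystemresource.mdf" "D:\SQLBackups\"';
EXEC master.dbo.xp_cmdshell
    'copy "C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\mssqlsystemresource.ldf" "D:\SQLBackups\"';
```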