Thursday 31 December 2009

2009

I never wrote down my goals for 2009. Not that I didn't have any. I'm always learning and often worry about what I don't know rather than what I do. I'm not sure why, I just do! My 'to do' list spans a Google Docs spreadsheet and several notes of personal projects, the majority of which are SQL based.


Development
A project to rewrite processing and reporting in an Enterprise Infrastructure is my baby. After a departure from the team, the project became my own. Having planned and documented the stages of the project, I began the frustrating learning curve that is SSIS. I learnt a lot of SSIS (Integration Services) on the fly out of necessity, through books, websites, blogs and trial and error.
I also utilised VB.NET 2008 for CLR functionality (something I had only dabbled in before). This was very successful in terms of improving the performance of 2 scalar functions.

Administration
DBA-wise, I migrated all environments to SQL 2008 Enterprise.
This left me free to investigate and implement Enterprise features.
Certifications

Halfway through the year I decided to attempt some certifications. The result was obtaining MCITP : Database Administrator 2008 in September and MCITP : Database Developer 2008 in October.
Both tracks involved passing 2 exams, first the MCTS (Technical Specialist) then the MCITP.

Sunday 27 December 2009

I/O Query - Average Stall times per database

Adapted from Glenn Berry's Average Stalls query
SELECT
DB_NAME(database_id) AS [Database Name]
,AVG(CAST(io_stall_read_ms/(1.0 + num_of_reads) AS NUMERIC(10,1))) AS [avg_read_stall_ms]
,AVG(CAST(io_stall_write_ms/(1.0+num_of_writes) AS NUMERIC(10,1))) AS [avg_write_stall_ms]
,AVG(CAST((io_stall_read_ms + io_stall_write_ms)/(1.0 + num_of_reads + num_of_writes) AS NUMERIC(10,1))) AS [avg_io_stall_ms]
FROM sys.dm_io_virtual_file_stats(null,null)
GROUP BY DB_NAME(database_id)
ORDER BY DB_NAME(database_id)


Link : Glenn Berry's Performance Queries

Tuesday 22 December 2009

Agile Development - Anything but!

Firstly, Agile must work. I'm sure some companies run it very well and reap the benefits.
The business gets flexibility and visibility of technical projects. The ad-hoc nature of being able to get on without a massive upfront cost of planning, meetings and documentation is surely a developer's dream?

In the interests of impartiality, here is an excellent video by Axosoft on how it is supposed to work.

Why am I currently unconvinced? One company's painful implementation of scrum and the epic volume of time wasting that ensued. The introduction of 'Agile' has been a real eye-opener. It has clearly shown the true colours of people wallowing in implementing processes, arguing over whether we are doing things 'the Agile way'. Rather than enabling the team to self-manage, it has become micro-management, an irony as it is anything but 'Agile'.

I'm pretty sure everyone is to blame for the scenario -
  • Managers for wanting stuff 'now' and circumventing the processes.
  • Developers for not pushing back and communicating lost time.
  • Everyone for wanting work done quickly (and being prepared to circumvent testing and qa processes).
We're mostly very conscious of the creation of technical debt, but this in itself is leading to a negative vibe towards quality. This negativity has propagated from attitudes to discussions and has been self-fulfilling in terms of code quality. Every day has become a fight to be allowed to engineer elegant technical solutions.

As a Database Professional the standard answer is 'It depends'. That's not to be awkward, it's the truth.
You can have your data now as a one-off run or automated daily in a process.
How efficiently that process runs depends on many levels of optimisation -
  • Have we chosen data types wisely?
  • How well is the code written?
  • Is it secure?
  • Does it scale? Today's data volume and execution plan will not be valid a year from now.
  • Is the hardware adequate?
  • Is it configured optimally?
Scrum doesn't discourage planning, it merely timeboxes it. Our implementation heavily encourages quick wins and the prioritization of aesthetically pleasing output over well thought out solutions. In many ways, getting everyone together to discuss the next project helps with the realism of how much will be omitted by releasing at the end of a 2 week cycle. To be fair, increasing technical debt due to quick wins can happen anywhere and cannot be totally blamed on 'Agile'.

The agility of the team is majorly hampered by other factors too, namely -
  • Domain masters. Members of the team who exclusively deal with certain projects/code.
  • Zero documentation. 
  • Minimal code commenting
  • Superficial testing (usually only of visual aspects)
  • A build/release process that only a couple of senior developers understand.
    (Old issues reappear at every release due to extremely poor release management).
The main benefit Agile has provided is a high level of business visibility (everyone likes to be in the know).

The cost for this has been extremely high however.
For the first 6 months the number of people involved at all stages has been ridiculous.
  • The entire team (originally numbering 16) participating in 'poker planning', ironically a time estimating task conducted first in story points and eventually (after much debate) in hours.
  • The entire team doing Sprint Planning (breaking down tasks into a painfully stupid level of detail).
  • The entire team doing Sprint 'reviews' (self-congratulatory bleugh...)
The above meetings happen EVERY 2 WEEKS. Rarely do we make a fortnightly release, surely the point of the cycle!

Training is a major point where I feel we have fallen down. Rather than train the team, some senior members attended training and eventually 'certified' as ScrumMasters (there's a parallel with a Rugby 'scrum' here in terms of the style in which early meetings were conducted). One informal overview session of knowledge transfer occurred and subsequent to that, each planning meeting was used to ram home 'scrum'. The lack of control (and initially timeboxing) of these sessions encouraged a 'free for all' debate that regularly took twice as long as it should. The relevance of the material did not match the skill sets and the number of attendees was far in excess of a quorum.

The upshot of this was that one team member became ScrumMaster and 'supported' the team for 2 weeks (support encompassed a range of skills around office comedy, Twitter and YouTube, with the useful addition of enquiring as to our progress daily). The Agile tool itself became a hot issue, with the team switching repeatedly between an online planner, a shared spreadsheet and finally a flip chart to show progression.

Fortunately for me it (eventually) became obvious that I was shoehorning my work into the scrum process for the sake of it. Poker Planning is irrelevant when you're the only one working with a technology and therefore the only one estimating it. Estimating has been largely guesswork due to the SSIS learning curve (or mountain). The same reasons apply for omitting detailed discussions around implementation and the planning process.

The DBA role (including a significant amount of ETL development) is now considered a SysAdmin one and I now sit outside of Scrum, acting in a consultancy capacity when necessary.

Not practising 'Agile' doesn't mean I'm not agile; it doesn't mean I cannot be flexible and respond quickly to well thought out requests. As a DBA I'm very reactive when I need to be. Admittedly, the attention I pay to planning, implementing and monitoring my environment means that surprises rarely occur.

Time is very important to me and this is my principal reason for opposing the scrum alliance. The man-hours wasted in these meetings are frustrating for the technically driven, let alone anyone striving to achieve a systems nirvana. Anyone with the slightest sense of pride in their productivity ends up making up this shortfall from personal time.

I do genuinely hope that I see Agile work well one day, but until then...

Agile : A selection of reading -

Chickens, Pigs, and Really Inappropriate Terminology


Google : Anything but waterfall

Top Reasons To Not Go Scrum/Agile

The decline and fall of Agile

The top 10 things I hate about Agile projects


The secret skill revealed in the rusty booklet

Bookmark : Custom Alert Scripts for Transactional Replication

There is such a wealth of great material coming out of the SQL community now.

This from Kun Lee on replication is excellent...

Customised Alerts for Transactional Replication

Monday 21 December 2009

New Site - BIDN

I seem to be endlessly posting links nowadays. There WILL be some real content soon! (I have about a dozen posts in draft at the moment).

Anyway, BIDN is Brian Knight's new site over at BIDN.com

It's short for Business Intelligence Developer Network and looks to be a promising community site, a bit like SQL Server Central but for BI developers.

Wednesday 16 December 2009

SSIS : Accessing Variables

During some recent SSIS package development I had cause to access a variable I had defined from scripts in both the Control and Data flows.

So that I remember (and for anyone else that cares) here is the subtle difference in syntax.
(I'm using SSIS 2008)

For the Script Task (within the Control flow) >

1) Add your defined variable to the ReadWriteVariables property of the task.
2) Within the script, refer to the variable like this :

Dts.Variables("User::MyVariable").Value = "New Value"


Within a Script Component (inside a Data flow) >

1) Add the variable to the ReadWriteVariables property of the component.
2) Within the script, refer to the variable like this :

Me.ReadWriteVariables("User::MyVariable").Value = "New Value"

Bookmark : How to build and maintain a tiered WSUS infrastructure

Could have done with this a couple of months back...

How to build and maintain a tiered WSUS infrastructure

Monday 7 December 2009

SSIS : Execute SQL Task for Dynamic Update Statements

Using the SSIS Execute SQL Task for updates

1) Create a Variable to hold the SQL command
2) Set EvaluateAsExpression to TRUE
3) Set the Expression (example below)

"UPDATE  [Audit].[ImportProgress] SET LastProcessedID = " +(DT_STR, 12,1252) @[User::LastProcessedID]  + " ,  LatestDate ='"  +(DT_WSTR, 20)(DT_DBTIMESTAMP) @[User::LastDateTime]  +  "'  WHERE ETLProcess = 1"

4) Add an 'Execute SQL Task' task to the control flow.
5) Set the SQLSourceType property to Variable
6) Set the SourceVariable property to the variable name specified in step 1

Wednesday 2 December 2009

SQL 2005 Express Edition - No Sql agent!

I've never had cause to look at the Express edition of SQL 2005 until today.

My surprise finding (and reason for this post) is that there is no SQL Agent!
Google brings back no end of third party solutions however...

http://www.microsoft.com/Sqlserver/2005/en/us/compare-features.aspx

Monday 23 November 2009

Powerpivot ???

I was recently privileged to see MS BI Architect Donald Farmer demonstrate Project Gemini (now PowerPivot) at SqlBits V.

Basically PowerPivot is Excel on Steroids, enabling you to define datasets as tables on the fly and join them.

Empowering you to manipulate millions of data rows on a desktop pc, it achieves this using the same compression technology used in SQL 2008.

I'll certainly be looking out for this in SQL 2008 R2 / Office 2010.


The best analogy for PowerPivot so far...

Saturday 21 November 2009

SQLBits V

I attended the excellent (and free) SQLBits conference this weekend.
A lot of great speakers, and an overload of information! I also got in for free on the SQL/Windows 2008 R2 day, thanks to Andrew Fryer @ Microsoft throwing a free ticket my way.

It was interesting to see the current fad that is Twitter heavily in use, with attendees using the hashtag #sqlbits to provide instant feedback and meet up. iPhones and tweeting have infiltrated the SQL community far more than I had realised.

I saw a range of speakers and have come away with multiple notes to add to my task list and future goals. The majority of sessions were filmed so hopefully the material will be available shortly at the sqlbits website. The speakers I saw are listed below with links to their blogs.

Keynote and Powerpivot demo
Donald Farmer

When a query plan goes wrong
Simon Sabin

SSIS in SQL Server 2008
Allan Mitchell

T-SQL Tuning
Brent Ozar

Powershell – old tricks for a new dog
Martin Bell

Optimistic Concurrency Internals
James Rowland-Jones

SQL Server optimization stuff you won’t find on Google (yet)
Andre Kamman

In addition to the SQL & BI content there were lunchtime sponsor sessions by many vendors. On both Friday and Saturday I attended 2 excellent sessions presented by Mark Whitehorn, explained much better here by Ash.

Tuesday 17 November 2009

SQL Service Account Settings Summary

A quick note and screenshot to remind me of those all important permissions I've granted to my SQL Service account in Active Directory.

"Act as part of the operating system"
"Adjust memory quotas for a process"
"Bypass traverse checking"
"Lock pages in memory"
"Log on as a batch job"
"Log on as a service"
"Perform volume maintenance tasks"
"Replace a process level token"
"Allow log on locally"

Monday 16 November 2009

Bookmark : Top 10 Free Windows tools for IT pros

The Top 10 Free Windows tools for IT pros
http://www.infoworld.com/d/windows/top-10-windows-tools-it-pros-792?page=0,1

1) Sysinternals - Advanced Windows Diagnostics
2) HWiNFO32 - CPU Information
3) CCleaner - Registry and file system cleaner
4) Recuva - Recover deleted files
5) FileZilla - FTP Client & FTP Server
6) Virtual CloneDrive - Mounts ISO images as drives
7) Defraggler - Disk Defragmentation tool
8) ImgBurn - Burn ISO Images to disk
9) 7-Zip - Free Open Source alternative to Winzip
10) VirtualBox - Free Virtualisation Software

Changing Windows Password in Windows 2008 RDP Session

(because Ctrl-Alt-Del operates on the host!)

Start > ‘ Windows Security ’

Then you are presented with ‘ Lock / Log off / Change a password... / Task Manager ’


Simples...

Noteworthy : Career Links (December 2009)

Career :

How to survive in IT (the soft evil skills)

Understanding your manager

Agile :

Chickens, Pigs, and Really Inappropriate Terminology

Sunday 15 November 2009

SQL 2008: Modifying Resource Governor's workload classification

How to change Resource Governor Classifier function on the fly...
Here's the TSQL ...

-- 1) Disable Resource Governor
ALTER RESOURCE GOVERNOR  DISABLE 
GO

-- 2) Remove the classifier function from Resource Governor so that you can change it
ALTER RESOURCE GOVERNOR
WITH ( CLASSIFIER_FUNCTION = NULL)
GO

-- 3) Change the function itself
ALTER FUNCTION ResourceGovClassifier()
RETURNS SYSNAME WITH SCHEMABINDING
AS
BEGIN
DECLARE @classification VARCHAR(32)

IF SUSER_SNAME() LIKE 'Dev_%'
SET @classification = 'Workload_Developer'

IF SUSER_SNAME() LIKE 'Adm_%'
SET @classification = 'Workload_DBA'

IF SUSER_SNAME() = 'applogin' 
SET @classification = 'Workload_App'


RETURN @classification 
END
GO

-- 4)  Set Resource Governor to use the revised function
ALTER RESOURCE GOVERNOR
WITH ( CLASSIFIER_FUNCTION = dbo.ResourceGovClassifier)
GO

-- 5) Re-enable Resource Governor
ALTER RESOURCE GOVERNOR RECONFIGURE
GO

Friday 13 November 2009

TSQL : Row count & Total Rows by OVER

Demonstrating OVER, PARTITION BY and ROW_NUMBER to return the total number of rows and an incrementing row count.
USE AdventureWorks
GO
SELECT 
COUNT(*) OVER(PARTITION BY NULL) AS TotalRowCount 
,ROW_NUMBER() OVER (ORDER BY [TransactionID] DESC) AS [ASC_ROWNUMBER]
,[TransactionID]
,[ProductID]
,[ReferenceOrderID]
,[ReferenceOrderLineID]
,[TransactionDate]
,[TransactionType]
,[Quantity]
,[ActualCost]
,[ModifiedDate]
FROM [AdventureWorks].[Production].[TransactionHistory]
GO 

Thursday 12 November 2009

VB.NET / SQL CLR / Decode encoded URL

Immersing myself in VB for SQL CLR functionality I found this to Decode an encoded URL.

I've made one minor change from chr$ to chr to make it compliant with .NET 3.5

Private Shared Function URLDecode(ByVal txt As String) As String
Dim txt_len As Integer
Dim i As Integer
Dim ch As String
Dim digits As String
Dim result As String

result = ""
txt_len = Len(txt)
i = 1
Do While i <= txt_len
' Examine the next character.
ch = Mid$(txt, i, 1)
If ch = "+" Then
' Convert to space character.
result = result & " "
ElseIf ch <> "%" Then
' Normal character.
result = result & ch
ElseIf i > txt_len - 2 Then
' No room for two following digits.
result = result & ch
Else
' Get the next two hex digits.
digits = Mid$(txt, i + 1, 2)
result = result & Chr(CInt("&H" & digits))
i = i + 2
End If
i = i + 1
Loop

URLDecode = result
End Function

Monday 9 November 2009

Bookmark : How To Create tables in Blogger via Excel

This link shows how to apply a style to a DIV and prevent tables expanding.
http://www.mt-soft.com.ar/2008/12/06/how-to-create-tables-in-blogger/

I found the easiest way to generate a clean table structure (i.e. without the styling info that a MS Word export provides) is to create a table in Excel by making up the row information in a formula.

for example >

=""&B1&"" - for each cell
=""&F1&G1&H1&I1&"" - for the enclosing row

Saturday 7 November 2009

SSIS : Expression Operators

Cast                                          Data type conversion (MSDN)
()                                            Parentheses, evaluation order of calculation
+                                             Addition (when numeric)
+                                             Concatenation (when strings)
-                                             Subtraction
-                                             Negation (if signed data type)
*                                             Multiplication
/                                             Division
%                                             Modulo
||                                            Logical OR
&&                                            Logical AND
!                                             Logical NOT
|                                             Bitwise Inclusive OR
^                                             Bitwise Exclusive OR
&                                             Bitwise AND
~                                             Bitwise NOT
==                                            Equal
!=                                            Unequal
>                                             Greater Than
<                                             Less Than
>=                                            Greater Than or Equal To
<=                                            Less Than or Equal To
(condition) ? (true_value) : (false_value)    Boolean condition (MSDN)

SSIS 2008 / SQL 2008 / .NET Data Types

SQL Server       | SSIS             | .NET            | SQL CLR (System.Data.SqlTypes namespace)
bigint           | DT_I8            | Int64           | SqlInt64
binary           | DT_BYTES         | Byte[]          | SqlBytes, SqlBinary
bit              | DT_BOOL          | Boolean         | SqlBoolean
char             | DT_STR           | None            | None
cursor           | None             | None            | None
date             | DT_DATE          | DateTime        | SqlDateTime
datetime         | DT_DBTIMESTAMP   | DateTime        | SqlDateTime
datetime2        | DT_DBTIMESTAMP2  | DateTime        | SqlDateTime
decimal          | DT_DECIMAL       | Decimal         | SqlDecimal
float            | DT_R8            | Double          | SqlDouble
image            | DT_IMAGE         | None            | None
int              | DT_I4            | Int32           | SqlInt32
money            | DT_CY            | Decimal         | SqlMoney
nchar            | DT_WSTR          | String, Char[]  | SqlChars, SqlString
ntext            | DT_NTEXT         | None            | None
numeric          | DT_NUMERIC       | Decimal         | SqlDecimal
nvarchar         | DT_WSTR          | String, Char[]  | SqlChars, SqlString
real             | DT_R4            | Single          | SqlSingle
rowversion       |                  | Byte[]          | None
smalldatetime    | DT_DBTIMESTAMP   | DateTime        | SqlDateTime
smallint         | DT_I2            | Int16           | SqlInt16
smallmoney       | DT_CY            | Decimal         | SqlMoney
sql_variant      |                  | Object          | None
table            |                  | None            | None
text             | DT_TEXT          | None            | None
time             | DT_DBTIME2       | TimeSpan        | TimeSpan
timestamp        | DT_BYTES         | None            | None
tinyint          | DT_UI1           | Byte            | SqlByte
uniqueidentifier | DT_GUID          | Guid            | SqlGuid
varbinary        | DT_BYTES         | Byte[]          | SqlBytes, SqlBinary
varchar          | DT_STR           | None            | None
xml              | DT_WSTR          | None            | SqlXml



Sources :
Mapping SQL Server 2008 data types to SSIS
Mapping SQL Server 2008 data types to .NET / CLR

Friday 6 November 2009

SSIS : Credential & Proxy

1) Setting up a credential -

USE [master]
GO
CREATE CREDENTIAL [SSIS_Credential] WITH IDENTITY = N'Domain\SQLServiceAgent', SECRET = N'longcomplicatedpassword'
GO

2) Setting up a proxy -

USE [msdb]
GO

EXEC msdb.dbo.sp_add_proxy @proxy_name=N'SSIS_Proxy',@credential_name=N'SSIS_Credential', 
  @enabled=1

GO

3) Assigning proxy to be available to schedule SSIS jobs -

USE [msdb]
GO
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name=N'SSIS_Proxy', @subsystem_id=11
GO

NB : A subsystem_id of 11 is for SSIS package execution.


Database Journal : Proxy Accounts in SQL Server


Or all steps in one go -

USE [master]
GO

CREATE CREDENTIAL [SSIS_Credential] WITH IDENTITY = N'Domain\SQLServiceAgent', SECRET = N'longcomplicatedpassword'
GO

USE [msdb]
GO

EXEC msdb.dbo.sp_add_proxy @proxy_name=N'SSIS_Proxy',@credential_name=N'SSIS_Credential', 
@enabled=1
GO

EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name=N'SSIS_Proxy', @subsystem_id=11
GO

Thursday 5 November 2009

Off-topic : Social Media

I use RSS, Twitter etc to follow the SQL Server community.

Here is an interesting article on where social media is going...
Six Social Media Trends for 2010

Thursday 29 October 2009

Configuring MSDTC (Distributed Transaction Co-ordinator)

MSDTC is required if transactions span multiple servers (i.e. linked servers, ugh!)

Firstly, the SERVICE needs to be running >



Secondly, you need to allow communication (Allow Inbound and Allow Outbound tickboxes below) >

Wednesday 28 October 2009

Dynamic Management Objects : sys.dm_exec_query_stats

Sys.dm_exec_query_stats is a dmv (dynamic management view) which stores summary information about queries in the query cache.

Sys.dm_exec_sql_text(sql_handle) is a function that returns the executed command from sql_handle.

Putting them together with CROSS APPLY (APPLY lets you join the output of a function), you can see what is being run and how often.
SELECT t.text , s.*
FROM sys.dm_exec_query_stats  s
CROSS APPLY sys.dm_exec_sql_text(sql_handle) t 
WHERE t.text NOT like 'SELECT * FROM(SELECT coalesce(object_name(s2.objectid)%'
ORDER BY execution_count DESC

This provides 35 columns summarising query activity, including counts, times, reads and writes statistics.

SQLDenis has provided a great query, which I've slightly adapted to order by the most commonly executed queries.
It tells you where the SQL is called from (ProcedureName) or shows 'Ad-Hoc' if not called from a procedure.
SELECT * FROM(SELECT COALESCE(OBJECT_NAME(s2.objectid),'Ad-Hoc') AS ProcedureName,execution_count,
(SELECT TOP 1 SUBSTRING(s2.TEXT,statement_start_offset / 2+1 ,
( (CASE WHEN statement_end_offset = -1
THEN (LEN(CONVERT(NVARCHAR(MAX),s2.TEXT)) * 2)
ELSE statement_end_offset END)  - statement_start_offset) / 2+1))  AS sql_statement,
last_execution_time
FROM sys.dm_exec_query_stats AS s1
CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS s2 ) x
WHERE sql_statement NOT like 'SELECT * FROM(SELECT coalesce(object_name(s2.objectid)%'
ORDER BY execution_count DESC


SQLDenis's original post >http://blogs.lessthandot.com/index.php/DataMgmt/DBProgramming/finding-out-how-many-times-a-table-is-be-2008

MSDN Reference : http://msdn.microsoft.com/en-us/library/ms189741.aspx

Tuesday 27 October 2009

Dynamic Management Objects : sys.dm_db_index_usage_stats

Sys.dm_db_index_usage_stats is a really helpful view that returns information on index usage.

Using AdventureWorks2008, we can list the indexes belonging to the TransactionHistory table ...
select * from sys.indexes
where object_id = object_id('production.transactionhistory')


Here we find out how they have been used ...
select i.name, s.* from sys.dm_db_index_usage_stats s
inner join sys.indexes i
on s.object_id = i.object_id
and s.index_id = i.index_id
where database_id = DB_ID('adventureworks2008')
and s.object_id = object_id('production.transactionhistory')


The view returns counts of seeks, scans and lookup operations as well as timestamps for the latest ones ...



This view is utilised by my Unused index and Index Sizes script as well as this simple example showing index usage in the current database.

Louis Davidson has an excellent article here >
http://sqlblog.com/blogs/louis_davidson/archive/2007/07/22/sys-dm-db-index-usage-stats.aspx

The MSDN reference is here >
http://msdn.microsoft.com/en-us/library/ms188755.aspx

Friday 23 October 2009

MCITP : SQL 2008 Database Developer

Following passing my MCTS on Monday, I passed MCITP Database Developer 2008 (exam 70-451) this afternoon.

Very relieved to get my evenings back. :)
r

Wednesday 21 October 2009

A trip down memory lane...

On the eve of the Windows 7 launch, Windows over the years...

http://news.bbc.co.uk/1/hi/technology/8316268.stm

Can personally remember using Windows 2 and running a Windows 3 network for school :)

Tuesday 20 October 2009

Cursor Template TSQL

DECLARE  @id BIGINT 
DECLARE tablecursor CURSOR FORWARD_ONLY FOR 
SELECT id 
FROM   SCHEMA.table 

OPEN tablecursor 
FETCH NEXT FROM tablecursor 
INTO @id 

WHILE (1 = 1) 
BEGIN 

IF @@FETCH_STATUS <> 0 
BREAK; 

PRINT @id 
--DO SOMETHING USEFUL HERE ! 

FETCH NEXT FROM tablecursor 
INTO @id 

END 

CLOSE tablecursor 
DEALLOCATE tablecursor 

Bookmark : Constraints on Large Text Columns

An interesting way of enforcing unique constraints which tackles the problem of index size by using HASHBYTES (like CHECKSUM but better)

SQL Server Unique Constraints for Large Text Columns
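
A minimal sketch of the idea (table and column names are hypothetical, not taken from the linked article) - a persisted computed column stores the hash, and the unique index is built on that rather than the wide text column :

CREATE TABLE dbo.TrackedUrls(
[UrlId] [bigint] IDENTITY(1,1) NOT NULL,
[Url] [varchar] (2000) NOT NULL,
[UrlHash] AS CAST(HASHBYTES('SHA1',[Url]) AS BINARY(20)) PERSISTED)
ON [PRIMARY]
GO

-- The 20 byte hash fits comfortably inside the 900 byte index key limit that the raw column would break
CREATE UNIQUE NONCLUSTERED INDEX UQ_TrackedUrls_UrlHash ON dbo.TrackedUrls ([UrlHash])
GO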

Demystified : How SQL Server uses threads

Data files & threads - Clarification (From Steve Jones' blog).
"The bottom line is that SQL Server uses a thread for each unique disk drive, NOT files. In SQL 2000, you can fool the system if the disk drives are volumes on the same physical disk. However in SQL 2005 the system checks to see if these are the same physical disk.

There are some caveats, but if you hear this rumor, dispel it. Let people know that multiple file groups (or files) only help if you have different physical I/O paths and drives."
http://www.sqlservercentral.com/blogs/steve_jones/archive/2009/10/13/sql-server-legend-data-files-and-threads.aspx

Therefore performance is only a reason for partitioning when separate disks are involved.
Obviously, partitioning is still worth the effort if you wish to reduce backup sizes & times by implementing a strategy that includes it.

Also, if data is partitioned and then your boss stumps up for another drive, you're ready to move some partitions!

SQL 2008 : Filtered Indexes

Nice new functionality in SQL 2008, Filtered Indexes are indexes defined with a WHERE clause!
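
In the meantime, a minimal sketch of the syntax (the table and column names are made up for illustration) - the index only contains the rows matching the WHERE clause :

-- Index only the unprocessed rows of a (hypothetical) queue-style table
CREATE NONCLUSTERED INDEX IX_Orders_Unprocessed
ON dbo.Orders (OrderDate)
WHERE Processed = 0
GO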

Nothing further here until I have time to explore, implement and write my own experiences...

http://sqlblogcasts.com/blogs/grumpyolddba/archive/2008/07/11/filtered-indexes-really-cool.aspx


http://sqlfool.com/2009/04/filtered-indexes-what-you-need-to-know/


http://www.mssqltips.com/tip.asp?tip=1785

Monday 19 October 2009

MCTS : SQL 2008, Database Development

In September I gained my MCTS & MCITP in SQL Server Administration.

This afternoon I'm happy to say I got MCTS Database Development, exam 70-433.

r

TSQL : Partitioning an existing table

A few months back I published SQL 2005 / 2008 : Table Partitioning - Run through, just enough notes to get Partitioning up and running (on SQL Enterprise / Developer editions).

What I forgot to post was how to move existing tables to the partition -

This is easily achieved by adding a clustered index that uses the partition scheme -

USE TestDB;
GO
--- Step 8 : (Optional/Recommended) Create Index on Partitioned Table
CREATE UNIQUE CLUSTERED INDEX IX_TestTable
ON TestTable(ID)
ON TestDB_PartitionScheme (ID);
GO 

Sunday 18 October 2009

Persisted Columns Problem : Persisting a time difference

Below you'll see I attempt to define 2 calculated columns as PERSISTED...
CREATE TABLE [dbo].[WebActivity](
[Id] [bigint] IDENTITY(1,1) NOT NULL,
[Url] [varchar] (255) NULL,
[TrackingDateTime] [datetime2] (0) NOT NULL,
[TrackingSeconds] AS DATEDIFF(SECOND,'20000101',TrackingDateTime) PERSISTED,
[TrackingDay] AS DATEDIFF(DAY,'20000101',TrackingDateTime) PERSISTED)
ON [PRIMARY]
GO

This fails however, with the error...

Msg 4936, Level 16, State 1, Line 1
Computed column 'TrackingSeconds' in table 'WebActivity' cannot be persisted because the column is non-deterministic.

What does this mean, then?

According to MSDN...

" Deterministic functions always return the same result any time they are called with a specific set of input values. "

and further down, the answer...

" Deterministic unless used with datetime, smalldatetime, or sql_variant. The datetime and smalldatetime data types are deterministic if the style parameter is also specified. "

So to obtain my 'deterministic' datetime2 difference I have used 2 CONVERT statements.
The first declares my defined epoch of 2000-01-01 as a DATETIME2.
The second casts the entire result back to a numeric data type.

CREATE TABLE [dbo].[WebActivity](
[Id] [bigint] IDENTITY(1,1) NOT NULL,
[Url] [varchar] (255) NULL,
[TrackingDateTime] [datetime2] (0) NOT NULL,
[TrackingSeconds] AS CONVERT(INTEGER,DATEDIFF(SECOND,CONVERT(DATETIME2,'20000101',112),TrackingDateTime)) PERSISTED,
[TrackingDay] AS CONVERT(SMALLINT,DATEDIFF(DAY,CONVERT(DATETIME2,'20000101',112),TrackingDateTime)) PERSISTED)
ON [PRIMARY]
GO


Saturday 17 October 2009

Visual Studio Add-in : BIDS Helper

BIDS Helper is a FREE add-in for BIDS/Visual Studio

As I'm currently working with Integration Services, these features are very much of interest >

* Create Fixed Width Columns
* Deploy SSIS Packages
* Design Warnings
* dtsConfig File Formatter
* Expression and Configuration Highlighter
* Expression List
* Fix Relative Paths
* Non-Default Properties Report
* Pipeline Component Performance Breakdown
* Reset GUIDs
* Smart Diff
* Sort Project Files
* Sortable Package Properties Report
* SSIS Performance Visualization
* Variables Window Extensions

Personally I’m particularly interested in the Performance Studio feature...

The Non-Default Properties Report is also excellent for when you lose track of what you've changed, sorry, tuned :)

Wednesday 14 October 2009

Free Tool : OpenDBDiff

This morning's SQLServerCentral email features a new (free and open source) tool called opendbdiff.
Basically it is a free alternative to Redgate's SQL Compare and is compatible with SQL 2005 & SQL 2008.

You can download it from the codeplex site here.

The 'Mak' reviews it here.

Tuesday 13 October 2009

SSIS : Introducing a wait step in the Control Flow

In a processing system I wanted SSIS to pause for a period of time (say 1 minute) to allow more data to arrive in my Import db.

The obvious solution was an empty 'For Loop' Container, constrained by looping until a minute had passed. Whilst trialing this approach I noticed the Physical Memory usage slowly increase (until it eventually consumed all memory and the package crashed).

My new approach is using an 'Execute SQL' task (shown below).

Basically, I am using the TSQL WAITFOR command, like this > WAITFOR DELAY '00:01:00'.
I am happy to report no memory issues and my package is running continuously with no unwanted side effects...

Wednesday 7 October 2009

SQL 2008 : Sparse Columns

If a column is sparsely (rarely) populated, you can benefit by declaring it as SPARSE.
By declaring columns where the majority of data is NULL as 'sparse' you can save storage space.
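
A minimal sketch (the table is hypothetical) - MiddleName is NULL for most rows, so it is declared SPARSE :

CREATE TABLE dbo.Person(
[PersonId] [int] IDENTITY(1,1) NOT NULL,
[FirstName] [varchar] (50) NOT NULL,
[MiddleName] [varchar] (50) SPARSE NULL,  -- NULLs in a sparse column take no storage space
[LastName] [varchar] (50) NOT NULL)
ON [PRIMARY]
GO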

The Technet Explanation of Sparse column functionality.

As ever, Pinal Dave has investigated Sparse columns fully.

Monday 5 October 2009

Finding Page Splits

Finding Page splits by using undocumented function fn_dblog (this queries the transaction log) -

SELECT *
FROM ::fn_dblog(NULL, NULL)
WHERE operation = 'LOP_DELETE_SPLIT'
ref ; http://killspid.blogspot.com/2006/07/using-fndblog.html

You can summarise them like this -
Select COUNT(1) AS NumberOfSplits, AllocUnitName , Context
From fn_dblog(NULL,NULL)
Where operation = 'LOP_DELETE_SPLIT'
Group By AllocUnitName, Context
Order by NumberOfSplits desc 
ref - Identifying Page Splits


Other ways to monitor  page splits -

Recommended links about Page Allocation

MS CSS Sql Server Engineers : How It Works: SQL Server Page Allocations

Recommended links about Page Splits

Tony Rogerson : What is a page split and why does it happen?
SQL Server Performance : At what point should I worry about page splits?
Michelle Ufford : Page Splitting & Rollbacks
Michelle Ufford : sys.fn_physLocCracker (SQL 2008 Undocumented function)

Sunday 4 October 2009

Minimising Data Compression Rebuild Time

ALTER TABLE [schema].[tablename] REBUILD WITH (DATA_COMPRESSION = PAGE, MAXDOP = 8)


Implementing table compression uses the server MAXDOP setting (sp_configure 'max degree of parallelism') if you don't set MAXDOP in the statement.
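
A quick sketch to check the instance-wide setting a rebuild will inherit when MAXDOP isn't specified :

-- 'max degree of parallelism' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'max degree of parallelism'
GO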

A good explanation of page compression and the MAXDOP setting is here >
http://sqlblog.com/blogs/linchi_shea/archive/2008/05/05/sql-server-2008-page-compression-using-multiple-processors.aspx

Saturday 3 October 2009

TSQL : While Loop (loop without cursor)

Really simple this, but I keep using it so I thought I'd post it.
How to loop in TSQL (without a CURSOR)
DECLARE @Counter INT
DECLARE @CounterText VARCHAR(30)
DECLARE @MaxCount INT
DECLARE @Stepsize INT

SELECT @Counter = 0
SELECT @MaxCount = 40000
SELECT @Stepsize = 1000

WHILE @Counter <= @MaxCount  
BEGIN   
 SET @CounterText = CONVERT(VARCHAR(30),@Counter,23)
 RAISERROR (@CounterText, 10, 1) WITH NOWAIT   
 -- do useful stuff here
 SELECT @Counter = @Counter + @Stepsize 
END 
and counting down....
DECLARE @Counter INT
DECLARE @CounterText VARCHAR(30)
DECLARE @MinCount INT
DECLARE @Stepsize INT

SELECT @Counter = 40000
SELECT @MinCount = 0
SELECT @Stepsize = 1000

WHILE @Counter >= @MinCount  
BEGIN   
 SET @CounterText = CONVERT(VARCHAR(30),@Counter,23)
 RAISERROR (@CounterText, 10, 1) WITH NOWAIT   
 -- do useful stuff here
 SELECT @Counter = @Counter - @Stepsize  
END 

Wednesday 30 September 2009

SQL 2008 : High Availability Whitepaper

Just out, a whitepaper on achieving High Availability in SQL 2008

http://msdn.microsoft.com/en-us/library/ee523927.aspx


Looking at the feature table, I'm glad I'm running Enterprise edition!

Tuesday 29 September 2009

TSQL : Comma Separated List of Columns

Like the title says, how to generate a comma separated list of columns for a given table...

DECLARE @SchemaName VARCHAR(100)
DECLARE @TableName  VARCHAR(100)
DECLARE @CommaSeparatedColumnList VARCHAR(MAX)

SET @SchemaName = 'myschema'
SET @TableName  = 'mytable'
SET @CommaSeparatedColumnList = ''

SELECT @CommaSeparatedColumnList = COALESCE(@CommaSeparatedColumnList + '[' + COLUMN_NAME + '],','')
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = @SchemaName
AND TABLE_NAME = @TableName          
ORDER BY ORDINAL_POSITION
SET @CommaSeparatedColumnList = LEFT(@CommaSeparatedColumnList,LEN(@CommaSeparatedColumnList)-1)

SELECT @CommaSeparatedColumnList

Monday 28 September 2009

2 ways to audit all logins

1) Via Server Properties -


2) via a TSQL script -
USE [master]
GO
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'AuditLevel', REG_DWORD, 3
GO

Friday 25 September 2009

MCITP : Database Administrator 2008

Following passing my MCTS on Monday, I passed MCITP Database Administrator 2008 this morning.

Contemplating the development track (70-433 & 70-451) next month to bring me up to date.

r

Tuesday 22 September 2009

MCTS : Database Administration 2008

Just a quick note to say I passed 70-432 : Microsoft SQL Server 2008, Implementation and Maintenance yesterday.

I used the MCTS Self-Paced Training Kit (Exam 70-432) if you want to do the same.

r

Sunday 20 September 2009

Stored Procedures : Execute as owner

Using EXECUTE AS OWNER in a stored procedure definition allows you to raise permissions for the execution of the procedure.

This enables an application login with low privileges to perform owner (dbo) privileged functionality using just the execute permission on the sproc.

CREATE PROCEDURE dbo.EmptyMyTable
WITH EXECUTE AS OWNER
AS
BEGIN
TRUNCATE TABLE dbo.TableA
END

Clay Lenhart : SQL Server Security with EXECUTE AS OWNER

Saturday 19 September 2009

SSIS : Using Stored Procedures with an OLEDB Connection in SSIS

Using Stored Procedures with an OLEDB Connection in SSIS

Large recordsets cause Visual Studio to freeze for long periods of time, giving the impression it has hung.
(This happens repeatedly when navigating data connections).

The Solution is to pass metadata up front to SSIS, hence avoiding LONG delays whilst SSIS validates your recordset.

There is a widely publicised work-around of using 'SET FMTONLY ON' to pass metadata only (passing 'SET FMTONLY ON; EXEC dbo.proc1') but I had limited success with this.


Basically, declare an empty record set of the correct type at the start of the procedure >

CREATE PROCEDURE dbo.usp_SSIS_Data_Fetch (@eventid BIGINT, @rows BIGINT) AS

BEGIN
SET NOCOUNT ON



-- Dummy Records to pass datatypes back to SSIS for validation without returning entire recordset.

-- (note, this expression always evaluates to false so is never used except to declare a result set of

-- the correct data types at the start of the procedure.)



IF 1 = 0

BEGIN

SELECT
CAST(NULL AS BIGINT) AS [ID]
,CAST(NULL AS VARCHAR(255)) AS [TrackingUrl]
,CAST(NULL AS VARCHAR(255)) AS [SearchTerm]
END

-- Genuine Result set

SELECT ID, TrackingUrl, SearchTerm FROM dbo.TrackingTable

END
GO

In addition to this you have a ValidateExternalMetadata property on an OLEDB Source inside a data flow. You can set this to false if you are confident the output of your Stored Procedure is not going to change!


Links :
http://munishbansal.wordpress.com/2009/02/18/set-fmtonly-on-useful-in-tsql-ssis-packages-for-using-temp-tables/
http://sqlserverpedia.com/blog/sql-server-bloggers/ssis-stored-procedure-metadata/

Friday 18 September 2009

Multiple CTEs in one statement

Putting this here as it had me stumped...
How to use CTEs (Common Table Expressions) together -

WITH
  cte1 as (SELECT * FROM schema.table1),
  cte2 as (SELECT * FROM schema.table2)

SELECT * FROM cte1 UNION SELECT * FROM cte2

Wednesday 16 September 2009

SQL Server blocked access to procedure 'sys.sp_OACreate'

Executed as user: Domain\SQLServiceAgent. SQL Server blocked access to procedure 'sys.sp_OACreate' of component 'Ole Automation Procedures' because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of 'Ole Automation Procedures' by using sp_configure. For more information about enabling 'Ole Automation Procedures', see "Surface Area Configuration" in SQL Server Books Online. [SQLSTATE 42000] (Error 15281). The step failed.

The message tells us exactly what to do, use sp_configure -

sp_configure 'show advanced options', 1
GO 
RECONFIGURE;
GO
sp_configure 'Ole Automation Procedures', 1
GO 
RECONFIGURE;
GO 
sp_configure 'show advanced options', 0
GO 
RECONFIGURE;

Sunday 13 September 2009

Career / Quote

A brilliant post from Brent Ozar, and an excellent quote...
" Being a good manager boils down to one simple thing: motivating people you dislike to do things they dislike. "
http://www.brentozar.com/archive/2009/09/managing-people-sucks-but-you-should-try-it-anyway/

Thursday 10 September 2009

SQL 2008 : Using MultiServer Queries

In SQL 2008 you can run a query against multiple servers at the same time by opening a query window against a server group rather than an individual server.

To demonstrate, a query that fetches data about a server -
SELECT
  TOP 1 BACKUP_START_DATE AS LAST_BACKUP
, @@VERSION AS SQL_VERSION
, CREATE_DATE AS LAST_STARTUP
, DATEDIFF(D,CREATE_DATE,GETDATE()) AS DaysUptime
FROM MSDB.DBO.BACKUPSET (NOLOCK)
CROSS JOIN SYS.DATABASES (NOLOCK)
WHERE SYS.DATABASES.NAME = 'TEMPDB'
ORDER BY BACKUP_SET_ID DESC

By executing against the group we get it executed against each server, hence a row for each >

News / Humour / Irony !

A colleague sent me this. A pigeon in South Africa is 'faster than broadband'

http://news.bbc.co.uk/1/hi/world/africa/8248056.stm

Wednesday 9 September 2009

SQL 2008 : SET ROWCOUNT Deprecated

Spotted this in Books Online today...
'Using SET ROWCOUNT will not affect DELETE, INSERT, and UPDATE statements in the next release of SQL Server. Do not use SET ROWCOUNT with DELETE, INSERT, and UPDATE statements in new development work, and plan to modify applications that currently use it. Also, for DELETE, INSERT, and UPDATE statements that currently use SET ROWCOUNT, we recommend that you rewrite them to use the TOP syntax. '
So any nibble delete or update functions using SET ROWCOUNT are going to have to be looked at.
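
For example, a nibble delete rewritten to use TOP (dbo.ArchiveQueue and the Processed column are hypothetical, just to illustrate the pattern) :

-- Old pattern, deprecated for DML :
SET ROWCOUNT 1000
DELETE FROM dbo.ArchiveQueue WHERE Processed = 1
SET ROWCOUNT 0

-- Recommended replacement using TOP :
DELETE TOP (1000) FROM dbo.ArchiveQueue WHERE Processed = 1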

Think that's plenty enough warning though....

Tuesday 8 September 2009

Server 'dev-02' is not configured for RPC.

Msg 7411, Level 16, State 1, Procedure usp_procname, Line 23
Server 'dev-02' is not configured for RPC.

These linked server options allow you to execute a stored procedure against a remote data source.
exec sp_serveroption @server='dev-02', @optname='rpc', @optvalue='true'
exec sp_serveroption @server='dev-02', @optname='rpc out', @optvalue='true'

Monday 7 September 2009

Immense do-it-yourself storage solution

Was sent this by a colleague. Clever stuff...

http://blog.backblaze.com/2009/09/01/petabytes-on-a-budget-how-to-build-cheap-cloud-storage/

TSQL : Who is connected and how !

Adapted from a newsgroup posting.
Gives authentication and Windows login information too -
select s.session_id,
s.host_name,
s.program_name,
s.client_interface_name,
s.login_name,
s.nt_domain,
s.nt_user_name,
c.auth_scheme,
c.client_net_address,
c.local_net_address,
--c.connection_id,
--c.parent_connection_id,
--c.most_recent_sql_handle,
(select text from master.sys.dm_exec_sql_text(c.most_recent_sql_handle )) as sqlscript,
(select db_name(dbid) from master.sys.dm_exec_sql_text(c.most_recent_sql_handle )) as databasename,
(select object_name(objectid) from master.sys.dm_exec_sql_text(c.most_recent_sql_handle )) as objectname
from sys.dm_exec_sessions s inner join sys.dm_exec_connections c
on c.session_id=s.session_id
--where login_name='XXXXX'

Wednesday 2 September 2009

SQL 2008 : Uncompressed Objects Procedure

SQL Server Central have published my latest procedure as 'script of the day'.

The procedure is based around SQL 2008 compression functionality and provides >
  1. Lists of tables & indexes without compression
  2. Lists of tables & indexes not using the desired compression (e.g. ROW when you've specified a compression type of PAGE)
  3. TSQL commands to compress the database objects.
SSC : SQL 2008 : Uncompressed Objects Procedure


March 2011 Update : Putting the function here too now as the SSC exclusivity period is over...

CREATE PROCEDURE dbo.UncompressedObjects (@database VARCHAR(50) = '' ,@emailrecipients VARCHAR(1000) = '' ,@emailprofile VARCHAR(50) = '' ,@compressiontype VARCHAR(4) = 'PAGE')
AS
BEGIN

/*
Procedure : dbo.UncompressedObjects
Version   : 1.0 (August 2009)
Author    : Richard Doering
Web       : http://sqlsolace.blogspot.com
*/

SET NOCOUNT ON

-- Check supplied parameters
IF @database = '' 
 BEGIN 
  PRINT 'Database not specified'
  RETURN 
 END
IF @database NOT IN (SELECT name FROM sys.databases) 
 BEGIN 
  PRINT 'Database ' + @database + ' not found on server ' + @@SERVERNAME
  RETURN 
 END
IF @emailrecipients = '' AND @emailprofile <> '' 
 BEGIN 
  PRINT 'Email profile given but recipients not specified'
  RETURN 
 END
IF @emailrecipients <> '' AND @emailprofile = '' 
 BEGIN 
  PRINT 'Email recipients given but profile not specified'
  RETURN 
 END
SET @compressiontype = UPPER(LTRIM(RTRIM(@compressiontype)))
IF @compressiontype NOT IN ('PAGE', 'ROW')
 BEGIN 
  PRINT 'CompressionType must be PAGE or ROW'
  RETURN 
 END
 
-- Declare variables
DECLARE @indexreport VARCHAR(MAX)
DECLARE @missingindexcompressiontsql VARCHAR(MAX)
DECLARE @missingindextablelist VARCHAR(MAX)
DECLARE @missingindexindexlist VARCHAR(MAX)
DECLARE @missingcompressiontablecount INT
DECLARE @missingcompressionindexcount INT
DECLARE @changeindexcompressiontsql VARCHAR(MAX)
DECLARE @changeindextablelist VARCHAR(MAX)
DECLARE @changeindexindexlist VARCHAR(MAX)
DECLARE @changecompressiontablecount INT
DECLARE @changecompressionindexcount INT
DECLARE @CurrentRow INT
DECLARE @TotalRows INT
DECLARE @Objecttype VARCHAR(10)
DECLARE @objectname VARCHAR(100)
DECLARE @command VARCHAR(1000)
DECLARE @emailsubject VARCHAR(100)
DECLARE @dynamicsql VARCHAR(MAX)               

-- Create temporary tables.
-- These are used because their scope is greater than a table variable i.e. we can pull results back from dynamic sql.
IF EXISTS (SELECT * FROM tempdb..sysobjects WHERE name LIKE '##MissingCompression%')
   DROP TABLE ##MissingCompression
IF EXISTS (SELECT * FROM tempdb..sysobjects WHERE name LIKE '##ChangeCompression%')
   DROP TABLE ##ChangeCompression      
CREATE TABLE ##MissingCompression
     (uniquerowid INT IDENTITY ( 1 , 1 ) PRIMARY KEY NOT NULL,
                   objecttype VARCHAR(10),
                   objectname VARCHAR(100),
                   command VARCHAR(500));
CREATE TABLE ##ChangeCompression
     (uniquerowid INT IDENTITY ( 1 , 1 ) PRIMARY KEY NOT NULL,
                   objecttype VARCHAR(10),
                   objectname VARCHAR(100),
                   command VARCHAR(500));
                   
-- Work out what indexes are missing compression and build the commands for them
SET @dynamicsql =
'WITH missingcompression
     AS (SELECT ''Table''  AS objecttype,
                s.name + ''.'' + o.name AS objectname,
                ''ALTER TABLE ['' + s.name + ''].['' + o.name + ''] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = ' + @compressiontype + ');'' AS command
         FROM  ' + @database + '.sys.objects o
                INNER JOIN  ' + @database + '.sys.partitions p
                  ON p.object_id = o.object_id
                INNER JOIN  ' + @database + '.sys.schemas s
                  ON s.schema_id = o.schema_id
         WHERE  TYPE = ''u''
                AND data_compression = 0
                AND Schema_name(o.schema_id) <> ''SYS''
         UNION
         SELECT ''Index'' AS objecttype,
                i.name AS objectname,
                ''ALTER INDEX ['' + i.name + ''] ON ['' + s.name + ''].['' + o.name + ''] REBUILD WITH ( DATA_COMPRESSION = ' + @compressiontype + ');'' AS command
         FROM   ' + @database + '.sys.dm_db_partition_stats ps
                INNER JOIN ' + @database + '.sys.indexes i
                  ON ps.[object_id] = i.[object_id]
                     AND ps.index_id = i.index_id
                     AND i.type_desc <> ''HEAP''
                INNER JOIN ' + @database + '.sys.objects o
                  ON o.[object_id] = ps.[object_id]
                INNER JOIN ' + @database + '.sys.schemas s
                  ON o.[schema_id] = s.[schema_id]
                     AND s.name <> ''SYS''
                INNER JOIN ' + @database + '.sys.partitions p
                  ON p.[object_id] = o.[object_id]
                     AND data_compression = 0)
                     
-- populate temporary table ''##MissingCompression''
INSERT INTO ##MissingCompression (objecttype, objectname, command)
SELECT objecttype, objectname, command FROM missingcompression ORDER BY objectname ASC, command DESC '
exec (@dynamicsql)

SET @dynamicsql =
'WITH changecompression
     AS (SELECT ''Table''  AS objecttype,
                s.name + ''.'' + o.name AS objectname,
                ''ALTER TABLE ['' + s.name + ''].['' + o.name + ''] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = ' + @compressiontype + ');'' AS command
         FROM  ' + @database + '.sys.objects o
                INNER JOIN  ' + @database + '.sys.partitions p
                  ON p.object_id = o.object_id
                INNER JOIN  ' + @database + '.sys.schemas s
                  ON s.schema_id = o.schema_id
         WHERE  TYPE = ''u''
                AND data_compression <> 0
                AND data_compression_desc <> ''' + @compressiontype + ''' 
                AND Schema_name(o.schema_id) <> ''SYS''
         UNION
         SELECT ''Index'' AS objecttype,
                i.name AS objectname,
                ''ALTER INDEX ['' + i.name + ''] ON ['' + s.name + ''].['' + o.name + ''] REBUILD WITH ( DATA_COMPRESSION = ' + @compressiontype + ');'' AS command
         FROM   ' + @database + '.sys.dm_db_partition_stats ps
                INNER JOIN ' + @database + '.sys.indexes i
                  ON ps.[object_id] = i.[object_id]
                     AND ps.index_id = i.index_id
                     AND i.type_desc <> ''HEAP''
                INNER JOIN ' + @database + '.sys.objects o
                  ON o.[object_id] = ps.[object_id]
                INNER JOIN ' + @database + '.sys.schemas s
                  ON o.[schema_id] = s.[schema_id]
                     AND s.name <> ''SYS''
                INNER JOIN ' + @database + '.sys.partitions p
                  ON p.[object_id] = o.[object_id]
                     AND data_compression <> 0
                     AND data_compression_desc <> ''' + @compressiontype + ''' )
                     
-- populate temporary table ''##ChangeCompression''
INSERT INTO ##ChangeCompression (objecttype, objectname, command)
SELECT objecttype, objectname, command FROM changecompression ORDER BY objectname ASC, command DESC '
exec (@dynamicsql)

-- We now have populated our temporary tables (##MissingCompression & ##ChangeCompression)

-- First, loop objects with no compression.
-- For each object >
--  1) increment the counter, 
--  2) add the object name to the list for display 
--  3) generate the tsql for compression commands

  -- set initial variables
  SET @missingindexcompressiontsql = ''
  SET @missingindextablelist = ''
  SET @missingindexindexlist = ''
  SET @missingcompressiontablecount = 0
  SET @missingcompressionindexcount = 0
  SELECT @TotalRows = Count(* ) FROM ##MissingCompression
  SELECT @CurrentRow = 1

  WHILE @CurrentRow <= @TotalRows
    BEGIN
   SELECT @Objecttype = objecttype,
      @objectname = objectname,
      @command = command
   FROM   ##MissingCompression
   WHERE  uniquerowid = @CurrentRow
      
   SET @missingindexcompressiontsql = @missingindexcompressiontsql + @command + Char(10) + Char(10) 
     
   IF @Objecttype = 'table'
     BEGIN
    SET @missingindextablelist = @missingindextablelist + @objectname + Char(10)     
    SET @missingcompressiontablecount = @missingcompressiontablecount + 1
     END
      
   IF @Objecttype = 'index'
     BEGIN
    SET @missingindexindexlist = @missingindexindexlist + @objectname + Char(10)
    SET @missingcompressionindexcount = @missingcompressionindexcount + 1
     END
      
   SELECT @CurrentRow = @CurrentRow + 1
    END
  
  
-- Now deal with Objects that need to change compression type
-- For each object >
--  1) increment the counter, 
--  2) add the object name to the list for display 
--  3) generate the tsql for compression commands

    -- set initial variables
  SET @changeindexcompressiontsql = ''
  SET @changeindextablelist = ''
  SET @changeindexindexlist = ''
  SET @indexreport = ''
  SET @changecompressiontablecount = 0
  SET @changecompressionindexcount = 0
  SELECT @TotalRows = Count(* ) FROM ##ChangeCompression
  SELECT @CurrentRow = 1

  WHILE @CurrentRow <= @TotalRows
    BEGIN
   SELECT @Objecttype = objecttype,
      @objectname = objectname,
      @command = command
   FROM   ##ChangeCompression
   WHERE  uniquerowid = @CurrentRow
      
   SET @changeindexcompressiontsql = @changeindexcompressiontsql + @command + Char(10) + Char(10)
     
   IF @Objecttype = 'table'
     BEGIN
    SET @changeindextablelist = @changeindextablelist + @objectname + Char(10)     
    SET @changecompressiontablecount = @changecompressiontablecount + 1
     END
      
   IF @Objecttype = 'index'
     BEGIN
    SET @changeindexindexlist = @changeindexindexlist + @objectname + Char(10)
    SET @changecompressionindexcount = @changecompressionindexcount + 1
     END
      
   SELECT @CurrentRow = @CurrentRow + 1
    END

   -- Build the text output for the report  >
   -- First for objects missing compression >
  IF (@missingcompressionindexcount + @missingcompressiontablecount) > 0
    BEGIN
   IF (@missingcompressiontablecount) > 0
     BEGIN
    SET @indexreport = @indexreport + 'Tables not currently utilising ' + @compressiontype + ' compression >' + Char(10) +  '--------------------------------------------' + Char(10) + @missingindextablelist + Char(13) + Char(13)
     END      
   IF (@missingcompressionindexcount) > 0
     BEGIN
    SET @indexreport = @indexreport + 'Indexes not currently utilising ' + @compressiontype + ' compression >' + Char(10) +  '---------------------------------------------' + Char(10) + @missingindexindexlist + Char(13) + Char(13)
     END
    END
 
  -- Now for objects using the incorrect compression type >
  IF (@changecompressionindexcount + @changecompressiontablecount) > 0
    BEGIN
   IF (@changecompressiontablecount) > 0
     BEGIN
    SET @indexreport = @indexreport + 'Tables with incorrect compression type >' + Char(10) + '--------------------------------------------' + Char(13) + Char(10) + @changeindextablelist + Char(13) + Char(10)
     END      
   IF (@changecompressionindexcount) > 0
     BEGIN
    SET @indexreport = @indexreport + 'Indexes with incorrect compression type >' + Char(10) + '---------------------------------------------' + Char(13) + Char(10) + @changeindexindexlist + Char(13) + Char(10)
     END
    END
  IF (@missingcompressionindexcount + @missingcompressiontablecount) > 0
   BEGIN
    SET @indexreport = @indexreport + char(10) + '/* TSQL to implement ' + @compressiontype + ' compression */' + Char(10) + '-----------------------------------' + Char(10) + 'USE [' + @database + ']' + Char(10) + 'GO' + Char(10) + @missingindexcompressiontsql + Char(13) + Char(10)
   END
 IF (@changecompressionindexcount + @changecompressiontablecount) > 0
   BEGIN
    SET @indexreport = @indexreport + char(10) + '/* TSQL to change to ' + @compressiontype + ' compression type */' + Char(10)  + '-------------------------------------' + Char(10) + 'USE [' + @database + ']' + Char(10) + 'GO' + Char(10) + @changeindexcompressiontsql + Char(13) + Char(10)
   END 
-- Tidy up. Remove the temporary tables.
DROP TABLE ##MissingCompression
DROP TABLE ##ChangeCompression

-- Display report and email results if there are any required actions >
IF ( (@changecompressionindexcount + @changecompressiontablecount + @missingcompressionindexcount + @missingcompressiontablecount) > 0)
 BEGIN
  -- Compression changes recommended, display them
  PRINT @indexreport
  -- If email paramters supplied, email the results too.
  IF @emailrecipients <> '' AND @emailprofile <> '' 
   BEGIN
    SET @emailsubject =  @@SERVERNAME + ' : Uncompressed object report : ' + @database + ' (' + @compressiontype + ' compression)'
    -- send email
    EXEC msdb.dbo.sp_send_dbmail
     @recipients = @emailrecipients,
     @subject = @emailsubject,
     @body = @indexreport, 
     @profile_name = @emailprofile
   END  
  END
 ELSE
  BEGIN
   PRINT 'No database objects to compress'
  END
END

GO

CREATE PROCEDURE dbo.UncompressedServerObjects AS
BEGIN 
SET NOCOUNT ON

DECLARE  @CurrentRow INT
DECLARE  @TotalRows INT
DECLARE  @DatabaseName NVARCHAR(255)                   
DECLARE  @Databases  TABLE(
   UNIQUEROWID  INT IDENTITY ( 1,1 ) PRIMARY KEY NOT NULL,
   DATABASENAME NVARCHAR(255)
   )

SELECT @CurrentRow = 1
               
INSERT INTO @Databases (DATABASENAME)
 SELECT NAME
 FROM SYS.DATABASES
 WHERE DATABASE_ID > 4
               
SELECT @TotalRows = COUNT(*) FROM @Databases
 
WHILE @CurrentRow <= @TotalRows  
 BEGIN    
  SELECT @DatabaseName = DATABASENAME      
  FROM @Databases     
  WHERE UNIQUEROWID = @CurrentRow         

  EXEC dbo.UncompressedObjects
    @database = @DatabaseName 
  , @compressiontype = 'PAGE'
  , @emailrecipients = 'emailaddress@domain.com'
  , @emailprofile = 'Profile Name' 

  SELECT @CurrentRow = @CurrentRow + 1  
 END 
END
GO

Monday 31 August 2009

Management Studio - Password Cache

Having changed a lot of passwords recently, I found that removing (or renaming for safety) the following file helped.

(Management Studio recreates it when you next start)

C:\Documents and Settings\[username]\Application Data\Microsoft\Microsoft SQL Server\100\Tools\Shell\SqlStudio.bin


http://stackoverflow.com/questions/349668/removing-the-remembered-login-and-password-list-in-sql-management-studio

Saturday 29 August 2009

Tools : 7zip

7zip is a file compression utility.
Like Winzip and Winzip Command line put together, oh, and it is FREE!

Download from here : http://www.7-zip.org/

I used the following command in a batch file to zip up an entire folder.
"C:\Program Files (x86)\7zip\7za.exe" a -r e:\sqlbackup\ssrs\ssrsarchive.7z e:\sqlbackup\ssrs\*.* -x!ssrsarchive.7z

The 'a' parameter at the start tells it to add files.
The '-x' at the end tells it what to exclude (in this case, the archive itself!)

Thursday 27 August 2009

SP : Testing Linked Server Availability

SQL Server comes with a system stored procedure sys.sp_testlinkedserver to test linked server availability.
Here I simply put that inside my own procedure to generate an email too.

CREATE PROCEDURE utils.[LinkedServerTest] @ServerName SYSNAME
AS 
 BEGIN
 DECLARE @Test BIT

 BEGIN TRY
 EXEC @Test= sys.sp_testlinkedserver @servername 

 PRINT 'Successfully connected to ' + CAST(@servername as VARCHAR(30))
 END TRY

 BEGIN CATCH
 PRINT 'Failed to connect to ' + CAST(@servername as VARCHAR(30))

 DECLARE @chvFrom VARCHAR(255)
 DECLARE @chvTo VARCHAR(255)
 DECLARE @chvSubject VARCHAR(255)
 DECLARE @chvBody VARCHAR(8000)

 SET @chvFrom = 'sql.admin@domain.co.uk'
 SET @chvTo = 'sql.admin@domain.co.uk'
 SET @chvSubject = 'Linked Server Connection Failure : ' + @servername + ' cannot be accessed from ' + @@SERVERNAME
 SET @chvBody  =  @chvSubject

 EXEC msdb.dbo.sp_send_dbmail 
    @profile_name='Mail Profile'
  , @recipients=@chvTo
  , @body=@chvBody
  , @subject=@chvSubject
  , @importance='High'
  
 RAISERROR ('Linked Server Failure', 16, 1, @chvSubject) WITH LOG
 
 END CATCH

 END
GO

Usage :
exec  utils.LinkedServerTest @ServerName = 'my linked server'
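
To run this on a schedule, a SQL Agent job is the obvious home for it. A minimal sketch (the job, step and schedule names below are just examples, the linked server name is assumed, and the step's database should be wherever the utils schema lives) :

USE msdb
GO
EXEC dbo.sp_add_job @job_name = 'Linked Server Check'
EXEC dbo.sp_add_jobstep
    @job_name = 'Linked Server Check'
  , @step_name = 'Test linked servers'
  , @subsystem = 'TSQL'
  , @database_name = 'master'  -- assumption : utils.LinkedServerTest lives in master
  , @command = 'EXEC utils.LinkedServerTest @ServerName = ''my linked server'''
EXEC dbo.sp_add_jobschedule
    @job_name = 'Linked Server Check'
  , @name = 'Hourly'
  , @freq_type = 4             -- daily
  , @freq_interval = 1
  , @freq_subday_type = 8      -- repeat every n hours
  , @freq_subday_interval = 1
EXEC dbo.sp_add_jobserver @job_name = 'Linked Server Check', @server_name = '(local)'
GO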

Wednesday 26 August 2009

WINSXS folder size explained

A very good post that explains the difference between files and 'hard links' >

http://www.davidlenihan.com/2008/11/winsxs_disk_space_usage_its_no.html

WINSXS still takes up a lot of space though, even once you discount the 'hard links'!

Monday 24 August 2009

Task Scheduler Task does not run - Error 2147943785

Googling this, all I could come across was >
Local Policies -> User Rights Assignment & add the user
running the task to the "Log on as a batch job" policy.

Since I'm running a domain, I needed to adjust my domain account via Group Policy >

Created an account Domain\TaskScheduler (with the usual random 25 character password, punctuation etc.)

I added the account to the 'Log on as a batch job' policy and ran gpupdate on my machines, but it wasn't enough.

I solved the issue by placing the account in my service accounts group Domain\Service Accounts

The group has the following rights

Allow log on locally
Act as part of the operating system
Adjust memory quotas for a process
Bypass traverse checking
Lock Pages in memory
Log on as a batch job
Log on as a service
Perform volume maintenance tasks
Replace a process level token

Sunday 23 August 2009

Enabling VT (Virtualisation Technology) for Hyper-V hosts

If you forget to enable VT in the BIOS and try to install Hyper-V, this is what you get...

" Hyper-V cannot be installed

Server Manager has detected that the processor on this computer is not compatible with Hyper-V. To install this role the processor must have a supported version of hardware assisted virtualization, and that feature must be turned on in the BIOS. "



The message is pretty self explanatory and I had indeed turned it on in the BIOS. What it doesn't tell you is that you need to cold boot after making BIOS changes, i.e. physically turn off the server, wait a while and start it back up!
Whilst in the BIOS, make sure 'Execute Disable' is set to 'Enabled' (confusingly named, I felt).

Friday 21 August 2009

TSQL : Unused Indexes and Index Sizes

A quick exercise in looking at saving a little data space by dropping some unused indexes.
I'm using two CTEs to combine the 2 pieces of code below and recommend indexes to lose.

The first is Jason Massie's Unused Index Query and the second is my Index Size script.

WITH UnusedIndexQuery (Object_ID, ObjectName, IndexName, Index_ID, Reads, Writes, Rows) AS
(
SELECT s.object_id,
  objectname=OBJECT_NAME(s.OBJECT_ID)
, indexname=i.name
, i.index_id  
, reads=user_seeks + user_scans + user_lookups  
, writes =  user_updates  
, p.rows
FROM sys.dm_db_index_usage_stats s JOIN sys.indexes i 
ON i.index_id = s.index_id AND s.OBJECT_ID = i.OBJECT_ID  
JOIN sys.partitions p ON p.index_id = s.index_id AND s.OBJECT_ID = p.OBJECT_ID
WHERE OBJECTPROPERTY(s.OBJECT_ID,'IsUserTable') = 1  
AND s.database_id = DB_ID()  
AND i.type_desc = 'nonclustered'
AND i.is_primary_key = 0
AND i.is_unique_constraint = 0
AND p.rows > 10000
)

, IndexSizes (schemaname,tablename,object_id,indexname,index_id,indextype,indexsizekb,indexsizemb,indexsizegb) AS
(
SELECT   sys_schemas.name AS SchemaName
  ,sys_objects.name AS TableName
  ,sys_objects.[object_id] AS object_id
  ,sys_indexes.name AS IndexName
  ,sys_indexes.index_id as index_id
  ,sys_indexes.type_desc AS IndexType
  ,partition_stats.used_page_count * 8 AS IndexSizeKB
  ,CAST(partition_stats.used_page_count * 8 / 1024.00 AS Decimal(10,3))AS IndexSizeMB
  ,CAST(partition_stats.used_page_count * 8 / 1048576.00 AS Decimal(10,3)) AS IndexSizeGB
FROM sys.dm_db_partition_stats partition_stats
INNER JOIN sys.indexes sys_indexes
  ON partition_stats.[object_id] = sys_indexes.[object_id] 
    AND partition_stats.index_id = sys_indexes.index_id
    AND sys_indexes.type_desc <> 'HEAP'
INNER JOIN sys.objects sys_objects
  ON sys_objects.[object_id] = partition_stats.[object_id] 
INNER JOIN sys.schemas sys_schemas  
  ON sys_objects.[schema_id] = sys_schemas.[schema_id] 
  AND sys_schemas.name <> 'SYS'
)


select IndexSizes.* 
, UnusedIndexQuery.Reads
, UnusedIndexQuery.Writes
, UnusedIndexQuery.Rows
from UnusedIndexQuery
inner join IndexSizes
on UnusedIndexQuery.object_id = IndexSizes.object_id
and UnusedIndexQuery.index_id = IndexSizes.index_id
order by reads 
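
As a follow-on (a minimal sketch rather than part of the query above), the candidates can be turned into DROP INDEX commands for review. Bear in mind sys.dm_db_index_usage_stats resets when the instance restarts, so make sure the stats cover a representative period before dropping anything.

SELECT 'DROP INDEX [' + i.name + '] ON [' + SCHEMA_NAME(o.schema_id) + '].[' + o.name + '];' AS command
FROM sys.indexes i
INNER JOIN sys.objects o
  ON o.object_id = i.object_id
INNER JOIN sys.dm_db_index_usage_stats s
  ON s.object_id = i.object_id
  AND s.index_id = i.index_id
  AND s.database_id = DB_ID()
WHERE OBJECTPROPERTY(i.object_id,'IsUserTable') = 1
  AND i.type_desc = 'NONCLUSTERED'
  AND i.is_primary_key = 0
  AND i.is_unique_constraint = 0
  AND (s.user_seeks + s.user_scans + s.user_lookups) = 0 -- recorded writes but no reads
ORDER BY o.name, i.name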

Thursday 20 August 2009

Testing SQL Version

A simple amendment to my backup stored procedure today.
If it's on a sql 2008 box, use compression and increase the BUFFERCOUNT.

IF CHARINDEX('Sql Server 2008', @@VERSION) <> 0
  BEGIN
  SET @Backupcommand = @Backupcommand + ' ,COMPRESSION, BUFFERCOUNT = 250 '
  END
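
An alternative sketch (not what I used, but arguably more robust than string matching on @@VERSION) is to compare the major version number from SERVERPROPERTY. The @Backupcommand variable is the same one used above; note that in SQL 2008 backup compression also needs Enterprise edition.

DECLARE @majorversion INT
-- ProductVersion is a dotted string such as '10.0.2531.0'; PARSENAME part 4 is the major version
SET @majorversion = CAST(PARSENAME(CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR(20)), 4) AS INT)

IF @majorversion >= 10 -- 10.x = SQL Server 2008
  BEGIN
  SET @Backupcommand = @Backupcommand + ' ,COMPRESSION, BUFFERCOUNT = 250 '
  END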

Wednesday 19 August 2009

TSQL : Schemas that do not own objects

-- schemas that do not own objects
select sys.schemas.* from sys.schemas
where schema_id not in (select schema_id from sys.objects)
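
If the aim is to tidy these up, the same check can generate DROP SCHEMA commands. A sketch only - it skips the built-in and fixed database role schemas, but a schema can still own other securables (types, XML schema collections), so review the output before running it.

SELECT 'DROP SCHEMA [' + name + '];' AS command
FROM sys.schemas
WHERE schema_id NOT IN (SELECT schema_id FROM sys.objects)
AND schema_id > 4     -- skip dbo, guest, INFORMATION_SCHEMA, sys
AND schema_id < 16384 -- skip fixed database role schemas (db_owner etc.)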

Tuesday 18 August 2009

SQL 2008 : Compression TSQL for uncompressed objects

A CTE (common table expression) to build the commands to compress uncompressed objects -
WITH missingcompression
   AS (SELECT Schema_name(schema_id) + '.' + name                                                                                   AS tablename,
              'ALTER TABLE [' + Schema_name(schema_id) + '].[' + name + '] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = PAGE);' AS command
       FROM   sys.objects
              INNER JOIN sys.partitions
                ON sys.partitions.object_id = sys.objects.object_id
                   AND sys.partitions.index_id IN (0, 1) -- heap or clustered index only
       WHERE  TYPE = 'u'
              AND data_compression = 0
              AND Schema_name(sys.objects.schema_id) <> 'SYS'
       UNION
       SELECT sys_schemas.name + '.' + sys_objects.name                                                                                                   AS tablename,
              'ALTER INDEX [' + sys_indexes.name + '] ON [' + sys_schemas.name + '].[' + sys_objects.name + '] REBUILD WITH ( DATA_COMPRESSION = PAGE ) ' AS command
       FROM   sys.dm_db_partition_stats partition_stats
              INNER JOIN sys.indexes sys_indexes
                ON partition_stats.[object_id] = sys_indexes.[object_id]
                   AND partition_stats.index_id = sys_indexes.index_id
                   AND sys_indexes.type_desc <> 'HEAP'
              INNER JOIN sys.objects sys_objects
                ON sys_objects.[object_id] = partition_stats.[object_id]
              INNER JOIN sys.schemas sys_schemas
                ON sys_objects.[schema_id] = sys_schemas.[schema_id]
                   AND sys_schemas.name <> 'SYS'
              INNER JOIN sys.partitions
                ON sys.partitions.[object_id] = sys_objects.[object_id]
                   AND sys.partitions.index_id = sys_indexes.index_id
                   AND data_compression = 0)

SELECT command FROM missingcompression ORDER BY tablename ASC, command DESC


The previous version builds all compression commands for tables and indexes.
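
One caveat worth coding for (a sketch, not in the original post) : data compression is an Enterprise-only feature in SQL 2008 (Developer and Evaluation editions aside), so check the engine edition before running the generated commands.

IF SERVERPROPERTY('EngineEdition') <> 3 -- 3 = Enterprise, Developer or Evaluation
  BEGIN
  RAISERROR ('Data compression requires Enterprise edition', 16, 1)
  END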

Sunday 16 August 2009

SQL 2008 : Apply Compression to all Tables and Indexes

Apply Compression to all Tables and Indexes
(example uses Page Compression)
SELECT 'ALTER TABLE [' + SCHEMA_NAME(schema_id) + '].[' + NAME + '] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = PAGE);' FROM sys.objects where TYPE = 'u'
UNION
SELECT 'ALTER INDEX ALL ON [' + SCHEMA_NAME(schema_id) + '].[' + NAME + '] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = PAGE);' FROM sys.objects where TYPE = 'u'

Disabling -ve DNS Caching (XP+)

Save the following as a .reg file and double click it to add it to the registry.


Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Dnscache\Parameters]
"ServiceDll"=hex(2):25,00,53,00,79,00,73,00,74,00,65,00,6d,00,52,00,6f,00,6f,\
 00,74,00,25,00,5c,00,53,00,79,00,73,00,74,00,65,00,6d,00,33,00,32,00,5c,00,\
 64,00,6e,00,73,00,72,00,73,00,6c,00,76,00,72,00,2e,00,64,00,6c,00,6c,00,00,\
 00
"MaxCacheEntryTtlLimit "=dword:00000001
"NegativeCacheTime"=dword:00000000
"NetFailureCacheTime"=dword:00000000
"NegativeSOACacheTime"=dword:00000000


Alternatively the full instructions / explanation are here.

Saturday 15 August 2009

SQL 2008 : Management Studio Error

Installing SQL 2008 on developer PCs, I came across this when starting Management Studio.

" Unhandled exception has occurred in a component in your application. If you
click Continue, the application will ignore this error and attempt to
continue. Key not valid for use in specified state. "


Clicking 'Details' ....

" ************** Exception Text **************
System.Security.Cryptography.CryptographicException: Key not valid for use
in specified state.

at System.Security.Cryptography.ProtectedData.Unprotect(Byte[] encryptedData, Byte[] optionalEntropy, DataProtectionScope scope)

..... "

It basically means SQL 2008 is trying to use a SQL 2005 config file.

To solve this,
  1. Navigate to C:\Documents and Settings\[username]\Application Data\Microsoft\Microsoft SQL Server\100\Tools\Shell

  2. Rename the old RegSrvr.xml to something else e.g. RegSrvr.bak

  3. Start Management Studio again (it will recreate the file and you'll be able to add your servers again)



Link : http://www.dbtalk.net/microsoft-public-sqlserver-tools/ssms-2008-a-512328.html