Thursday, 15 December 2011
What if you've been a fool with log shipping?
First, disable the Log Shipping Copy and Log Shipping Restore jobs on the primary server. You won't be able to do this by right-clicking on the SQL Agent job and choosing 'Disable', but you will if you go into the properties for the job. You won't, at this point, be able to delete the job.
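If the properties dialog feels like a faff, the same thing can be scripted. This is a sketch: the job names below are the log shipping defaults and will almost certainly need substituting for your own.

```sql
-- Disable the leftover copy and restore jobs by name.
-- LSCopy_/LSRestore_ are the log shipping default prefixes;
-- substitute your actual job names.
EXEC msdb.dbo.sp_update_job
    @job_name = N'LSCopy_PrimaryServer_DBName',
    @enabled  = 0;

EXEC msdb.dbo.sp_update_job
    @job_name = N'LSRestore_PrimaryServer_DBName',
    @enabled  = 0;
```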
Then, check the msdb secondary log shipping tables.
SELECT *
FROM msdb..log_shipping_secondary_databases
SELECT *
FROM msdb..log_shipping_monitor_secondary
SELECT *
FROM msdb..log_shipping_secondary
Verify that the database(s) are indeed in the list here.
Run the following SQL to delete the erroneous secondary databases:
DELETE
FROM msdb..log_shipping_secondary_databases
WHERE secondary_database = 'DBName'
DELETE
FROM msdb..log_shipping_monitor_secondary
WHERE secondary_database = 'DBName'
DELETE
FROM msdb..log_shipping_secondary
WHERE primary_database = 'DBName'
After you have run this script for each database, the LSAlert job will stop erroring.
You will also, now, be able to delete the erroneous LSCopy and LSRestore jobs from SQL Agent.
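If you'd rather script that last bit of cleanup too, sp_delete_job will remove the jobs. Again, the job names here are the log shipping defaults and are placeholders for your own.

```sql
-- Remove the now-orphaned log shipping jobs; substitute your job names.
EXEC msdb.dbo.sp_delete_job @job_name = N'LSCopy_PrimaryServer_DBName';
EXEC msdb.dbo.sp_delete_job @job_name = N'LSRestore_PrimaryServer_DBName';
```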
As always, use at your own risk.
Wednesday, 30 November 2011
Collation
Collation can be set at all different levels in SQL, from the server down to the column level. And it's a pain when it doesn't match up. Fortunately, there is a fast workaround for T-SQL queries.
SELECT a.employeeid, a.name, b.employeeid, b.name
FROM employee a, employeedetails b
WHERE a.employeeid = b.employeeid
If the collation on employeeid in table a does not match the collation for employeeid in table b, there will be a nasty error, along the lines of "Cannot resolve collation conflict for equal to operation".
Easy enough to deal with - tell SQL to use what *should* be the default for the database(s) in question. This worked when I was dealing with two databases with totally different collations on the same server - the default server collation didn't trickle down to a database imported from another, older, server.
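To see which collations are actually in play before reaching for the workaround, they can be checked at each level. The employee table name comes from the example above; substitute your own.

```sql
-- Collation at the server and current database level.
SELECT SERVERPROPERTY('Collation')                AS server_collation;
SELECT DATABASEPROPERTYEX(DB_NAME(), 'Collation') AS database_collation;

-- Collation on each character column of the table in question.
SELECT c.name, c.collation_name
FROM sys.columns c
WHERE c.object_id = OBJECT_ID('employee')
  AND c.collation_name IS NOT NULL;
```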
SELECT a.employeeid, a.name, b.employeeid, b.name
FROM employee a, employeedetails b
WHERE a.employeeid COLLATE DATABASE_DEFAULT = b.employeeid COLLATE DATABASE_DEFAULT
This is a lot simpler to implement than any solution that involves trying to re-collate one of the columns to the other table/database/column's collation. It's vastly more generic.
Wednesday, 23 November 2011
How to Estimate the size of a Nonclustered Index
So I put it into a spreadsheet.
The spreadsheet only deals with non-unique non-clustered indexes at the moment, because that's what I was trying to work out. It should be reasonably simple - plug in the numbers in the cells adjacent to the bold text, make sure that you're calculating the correct number of non-leaf levels in cell B53, and make sure to use it in conjunction with the link above.
This spreadsheet is provided as-is, without warranty. It's for estimating the size of an index - actual size of indexes may vary.
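As a sanity check on any estimate, the actual size of an existing nonclustered index can be read from the DMVs. The table name dbo.MyTable below is a placeholder.

```sql
-- Actual pages and KB used per nonclustered index on a table.
-- Pages are 8 KB each, hence the * 8.
SELECT i.name                   AS index_name,
       ps.used_page_count,
       ps.used_page_count * 8   AS used_kb
FROM sys.dm_db_partition_stats ps
JOIN sys.indexes i
  ON i.object_id = ps.object_id
 AND i.index_id  = ps.index_id
WHERE ps.object_id = OBJECT_ID('dbo.MyTable')  -- placeholder table name
  AND i.type = 2;                              -- 2 = nonclustered
```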
Wednesday, 12 October 2011
T-SQL to give domain accounts sysadmin access
CREATE LOGIN [Domain\UserName] FROM WINDOWS;
GO
EXEC master..sp_addsrvrolemember [Domain\UserName],'sysadmin'
GO
Why yes. I did just lock myself out of a server accidentally this morning. Where Management Studio wasn't installed. Go figure.
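For the nervous, a couple of sanity checks before closing the connection: the first confirms the current login really is sysadmin, the second lists everyone in the role.

```sql
-- Returns 1 if the current login is a member of sysadmin.
SELECT IS_SRVROLEMEMBER('sysadmin') AS am_i_sysadmin;

-- Everyone currently in the sysadmin role.
SELECT p.name
FROM sys.server_role_members rm
JOIN sys.server_principals r ON r.principal_id = rm.role_principal_id
JOIN sys.server_principals p ON p.principal_id = rm.member_principal_id
WHERE r.name = 'sysadmin';
```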
Tuesday, 19 July 2011
Reporting Services Errors
An error occurred during client rendering.
An error has occurred during report processing.
Cannot read the next data row for the dataset ThisDataSet.
For more information about this error navigate to the report server on the local server machine, or enable remote errors
Before I got as far as enabling remote errors, I tried running the report on the local server, where, of course, it was impossible to even navigate to the report as this error popped up:
User 'Domain\UserName' does not have required permissions. Verify that sufficient permissions have been granted and Windows User Account Control (UAC) restrictions have been addressed.
This was fixed by running Reports in Internet Explorer using Run As Administrator permissions, and then granting the appropriate access via the Security settings within SSRS (a little bizarre, I'm sure you'll agree: if I could get to the report using the client machine, I should have been able to see it on the local server using exactly the same Windows login).
Then, however, the report ran perfectly happily. I poked about for the original error. I found an article telling me where to find the Report Server Log files, and looking in them, found this:
WARN: Microsoft.ReportingServices.ReportProcessing.ProcessingAbortedException: An error has occurred during report processing. ---> Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: Cannot read the next data row for the dataset ThisDataSet. ---> System.Data.SqlClient.SqlException: The conversion of a nvarchar data type to a datetime data type resulted in an out-of-range value.
webserver!ReportServer_0-29!26e4!07/19/2011-13:05:43:: e ERROR: Reporting Services error Microsoft.ReportingServices.Diagnostics.Utilities.RSException: An error has occurred during report processing. ---> Microsoft.ReportingServices.ReportProcessing.ProcessingAbortedException: An error has occurred during report processing. ---> Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: Cannot read the next data row for the dataset ThisDataSet ---> System.Data.SqlClient.SqlException: The conversion of a nvarchar data type to a datetime data type resulted in an out-of-range value.
I looked at the dates on both versions of the report. They were both definitely dates. However. On the local server, the format was
m/dd/yyyy hh:mm:ss AM
and on the client machine it was
dd/mm/yyyy hh:mm:ss
Aha.
The date on the local server is in UTC, because that's what the third party supplier wanted. The date everywhere else is in local time. And, as a result, the date on the local server is in the format that the report expects, whereas the date on the client machine isn't...
And, since whoever wrote the report didn't consider that people around the world use different date formats, or that reports might not be run on the local server, the whole thing came to a crashing halt.
The workaround, other than running the report on the server, is to manually change the date format to the one which works. But, my goodness, that's a faff: and what's wrong with having one of those little calendar thingies to click on?
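A sturdier fix, assuming you can edit the report's query, is to convert the string with an explicit style code rather than leaving it to the session's regional settings. The date literals here are made up for illustration.

```sql
-- Explicit style codes make the string-to-datetime conversion
-- independent of the session's regional settings.
SELECT CONVERT(datetime, '12/31/2011', 101) AS us_format;  -- 101 = mm/dd/yyyy
SELECT CONVERT(datetime, '31/12/2011', 103) AS uk_format;  -- 103 = dd/mm/yyyy
```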
Monday, 13 June 2011
Execute xp_cmdshell
So.
First, make sure that you have SA access on the server yourself. Then make sure the account that you want to use has access to the relevant databases on the SQL server. It can be a SQL user account or a Windows account: it doesn't matter. For the purpose of this snippet, I'm using a SQL account called 'user'. I'm going to go out on a limb, and guess that you already know how to create an SQL user account.
Then, grant the user execute rights on xp_cmdshell (run this in master, which is where xp_cmdshell lives):
GRANT EXEC ON xp_cmdshell TO [user];
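One prerequisite worth stating: xp_cmdshell is switched off by default, so if the grant or a later call complains, it may need enabling first (this requires sysadmin).

```sql
-- xp_cmdshell is an advanced option, so expose those first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Then switch xp_cmdshell on.
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```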
Next, we need a proxy account which does have sysadmin access, and we'll use this account to run xp_cmdshell to run calls by users who are not members of the sysadmin role. This account must be a windows account. Give it access only to those drives and folders which it requires for the purposes of executing anticipated xp_cmdshell calls.
The proxy created is ##xp_cmdshell_proxy_account##, and in order to create it, you need both the Windows Login which has sysadmin access and its password.
EXEC sp_xp_cmdshell_proxy_account 'domain\sysadminlogin', 'password';
Check that this has been created by querying the sys.credentials catalog view:
SELECT * FROM sys.credentials
Finally, verify that the SQL user has access to execute xp_cmdshell with the following code
EXECUTE AS LOGIN = 'user'
GO
EXEC xp_cmdshell 'whoami.exe'
REVERT
The result should be the Domain\Sysadmin account specified earlier.
Thursday, 14 April 2011
How far has my backup or restore got?
SELECT r.session_id,r.command,
CONVERT(NUMERIC(6,2),r.percent_complete) AS [Percent Complete],
CONVERT(VARCHAR(20),DATEADD(ms,r.estimated_completion_time,GetDate()),20) AS [ETA Completion Time],
CONVERT(NUMERIC(10,2),r.total_elapsed_time/1000.0/60.0) AS [Elapsed Min],
CONVERT(NUMERIC(10,2),r.estimated_completion_time/1000.0/60.0) AS [ETA Min],
CONVERT(NUMERIC(10,2),r.estimated_completion_time/1000.0/60.0/60.0) AS [ETA Hours],
CONVERT(VARCHAR(1000),(
SELECT SUBSTRING(text,r.statement_start_offset/2,
CASE WHEN r.statement_end_offset = -1 THEN 1000
ELSE (r.statement_end_offset-r.statement_start_offset)/2 END)
FROM sys.dm_exec_sql_text(sql_handle)))
FROM sys.dm_exec_requests r WHERE command IN ('RESTORE DATABASE','BACKUP DATABASE')
It's based on the sys.dm_exec_requests view, so works for SQL 2005 and above. The code was here but had a slight typo which meant it wouldn't work for restores in the first instance.
It still doesn't solve the problem of SQL sitting on 100% for a restore for an age and a bit - this is due to the way SQL restores work. First, they restore the data to the files, in the data-copy phase: this is the bit that racks up to 100%. Then, any committed transactions that were present in the transaction log at backup time are rolled forward, and any incomplete transactions are rolled back. This is the redo and undo phase, and SQL doesn't record the progress of this. If the log file is particularly large, and the database is in SIMPLE recovery, this can take quite a while to work through...
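For future restores (and backups), progress can also be reported by the statement itself, which saves polling the DMV for the data-copy phase at least. The database and file names here are placeholders.

```sql
-- STATS = 5 prints a progress message every 5 per cent of the
-- data-copy phase; it still won't cover the redo/undo phase.
RESTORE DATABASE MyDatabase
FROM DISK = N'D:\Backups\MyDatabase.bak'
WITH STATS = 5;
```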
Wednesday, 9 February 2011
Code Snippet
It took me a while to work out how to do what I wanted to do: gather the output of sp_helprotect from every database on the server into a single table.
This code is provided as-is, with no warranties. Test it before using it in earnest: and if you want to share the joy, please credit me. I wouldn't have managed it without the code here.
CREATE TABLE #temptable
(
[Owner] VARCHAR(50),
[Object] VARCHAR(100),
grantee VARCHAR(50),
grantor VARCHAR(50),
protecttype VARCHAR(50),
[Action] VARCHAR(50),
[Column] VARCHAR(50),
dbname VARCHAR(50)
)
DECLARE @STATEMENT VARCHAR(500)
SET @STATEMENT = 'Use [' + '?' + ']
INSERT INTO #temptable
([Owner],
[Object],
Grantee,
Grantor,
ProtectType,
[Action],
[Column]
)
EXEC (''sp_helprotect'')
UPDATE #temptable SET dbname = db_name()
WHERE dbname IS NULL'
EXEC master.dbo.sp_MSforeachdb @command1 = @STATEMENT
GO
SELECT *
FROM #temptable
ORDER BY dbname
GO
DROP TABLE #temptable
GO
Tuesday, 4 January 2011
Cannot generate SSPI context
This suggestion is particularly pertinent after a long holiday, such as Christmas, when your user has forgotten their password and has mistyped it several times...