Posts Tagged SQL Server
Ran across a surprising SQL Server thing today while updating a query: if you run a query that matches no rows and use the MIN() aggregate function, you get back a single row containing NULL, and @@ROWCOUNT is set to 1. If you write the query as a TOP(1) query instead, a run that matches no rows returns an empty resultset, and @@ROWCOUNT is set to zero as expected.
Example of this in action (tested on SQL Server 2012):
CREATE TABLE MyTable(MyId INTEGER, MyName NVARCHAR(128));
SELECT MIN(MyId) FROM MyTable WHERE MyName LIKE 'MyTag'
IF @@ROWCOUNT <> 0 PRINT 'Nonzero row count' ELSE PRINT 'Zero row count' --Output: Nonzero row count
SELECT TOP(1) MyId FROM MyTable WHERE MyName LIKE 'MyTag' ORDER BY MyId
IF @@ROWCOUNT <> 0 PRINT 'Nonzero row count' ELSE PRINT 'Zero row count' --Output: Zero row count
DROP TABLE MyTable;
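If the goal is to detect "no matching rows" while still using MIN(), one workaround (a sketch against the same hypothetical table) is to test the aggregate result for NULL instead of relying on @@ROWCOUNT:

```sql
DECLARE @minId INTEGER;
SELECT @minId = MIN(MyId) FROM MyTable WHERE MyName LIKE 'MyTag';
IF @minId IS NULL
    PRINT 'No matching rows';  -- (or the matching MyId values were all NULL)
ELSE
    PRINT 'Found at least one matching row';
```

Note the caveat in the comment: MIN() also returns NULL when rows matched but the column itself was NULL, so this check only works cleanly on a non-nullable column.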
To clear the cache on Oracle:
ALTER SYSTEM FLUSH BUFFER_CACHE;
ALTER SYSTEM FLUSH SHARED_POOL;
To clear the cache on SQL Server:
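The standard commands for this on SQL Server are:

```sql
CHECKPOINT;             -- write dirty pages to disk first
DBCC DROPCLEANBUFFERS;  -- clear the buffer (data) cache
DBCC FREEPROCCACHE;     -- clear the plan cache
```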
(I recently found these notes that I had written up in January and February of 2010. I am posting them in the hope that they may be a help to me and others, even in their rough form.)
It worked to just generate all the fields, with empty fields for ones I didn’t care to fill in.
for %d in (Foo Bar Baz Quux) do bcp MyDatabase.dbo.%d format nul -T -c -f %d.fmt
This resulted in the creation of Foo.fmt, Bar.fmt, Baz.fmt, and Quux.fmt files.
9.0
10
1   SQLCHAR     0  22   "\t"    1   FooID       SQL_Latin1_General_CP1_CI_AS
2   SQLINT      0  4    "\t"    2   FooTypeID   ""
3   SQLINT      1  4    "\t"    3   TitleID     ""
4   SQLNCHAR    2  100  "\t"    4   LastName    SQL_Latin1_General_CP1_CI_AS
5   SQLNCHAR    2  100  "\t"    5   FirstName   SQL_Latin1_General_CP1_CI_AS
6   SQLNCHAR    2  100  "\t"    6   MiddleName  SQL_Latin1_General_CP1_CI_AS
7   SQLNCHAR    2  100  "\t"    7   Department  SQL_Latin1_General_CP1_CI_AS
8   SQLNCHAR    2  200  "\t"    8   Company     SQL_Latin1_General_CP1_CI_AS
9   SQLCHAR     2  22   "\t"    9   OrgID       SQL_Latin1_General_CP1_CI_AS
10  SQLTINYINT  0  1    "\r\n"  10  Active      ""
I had created some test data on a labmanager VM. It had taken some time — I had recorded a few calls, an email, and a chat, then tweaked the recording metadata to try out a few important aspects of the stored procedure I was testing.
It’s at these times that I get the nagging feeling — the test was so time-intensive to create and run, why not take the step of automating it so we can run it in the future and catch regressions ‘most instantly?
I wanted an SQL script that I could run that would INSERT the test data as it currently existed on my test machine. And I didn’t want to create those INSERT statements by hand. Boy, did I not want to write them all by hand.
Microsoft SQL Server 2005 comes with something called Database Publishing Wizard (look under
Program Files\Microsoft SQL Server\90\Tools\Publishing\1.2\ ). With the following command:
sqlpubwiz script -d MyDatabase -dataonly c:\temp\MyDatabase.sql
…I was able to generate INSERT statements for all the data in my little test database to an .sql script file. I then edited the file to remove things I didn’t need (and to avoid key collisions by substituting
NEWID() calls for hardcoded uniqueidentifiers as necessary).
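For example, where the wizard-generated script had a hardcoded key (hypothetical table and values shown), I changed it along these lines:

```sql
-- As generated by the wizard (hardcoded key could collide on re-run):
-- INSERT INTO MyTable (MyId, MyName)
--     VALUES ('a0ca1a4f-0933-4565-b5da-d8fc3222a42a', N'blah_user -');

-- Edited to generate a fresh key on each run:
INSERT INTO MyTable (MyId, MyName) VALUES (NEWID(), N'blah_user -');
```

(Where other rows reference the key, capture NEWID() in a variable first and use that variable in both the parent and child INSERTs.)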
Finally, with all the data in place, I could add the following tests. (If anything goes wrong, an error message appears in the Messages window.)
DECLARE @tempTable TABLE(
    BlahId uniqueidentifier,
    BlahType int,
    BlahDateUTC DateTime,
    BlahLength int,
    LocalPartyName nvarchar(60),
    PhoneNumber nvarchar(60),
    RightsMask int);
DECLARE @return_value int;

INSERT INTO @tempTable
EXEC @return_value = [dbo].[blah_get_blah_chunk]
    @i_sessionId = @sessionId,
    @i_chunkSize = 50;
IF @return_value <> 0
    RAISERROR('Return value should be zero but was %d', 15, 1, @return_value);

DECLARE @expectedRecords int;
DECLARE @actualRecords int;
SET @expectedRecords = 5;
SET @actualRecords = (SELECT COUNT(*) FROM @tempTable);
IF @actualRecords <> @expectedRecords
    RAISERROR('Expected %d records in result set but found %d', 15, 1,
        @expectedRecords, @actualRecords);

DECLARE @actualDisplayName nvarchar(60);
DECLARE @expectedDisplayName nvarchar(60);
SET @expectedDisplayName = N'blah_user -';
SELECT @actualDisplayName =
    (SELECT LocalPartyName FROM @tempTable
     WHERE BlahId = N'a0ca1a4f-0933-4565-b5da-d8fc3222a42a');
IF @actualDisplayName <> @expectedDisplayName
    RAISERROR('Expected display name to be "%s" but it was "%s"', 15, 1,
        @expectedDisplayName, @actualDisplayName);
GO
In the future we could have a script test runner that keeps track of the output and flags errors as part of the build… but one step at a time, right? This already takes test scenarios that took more than a day to set up and makes them runnable in a minute or two.
Database Publishing Wizard can also generate script to create the entire database schema, but it’s the data generation that will support my unit test needs.
Thanks to Pinalkumar Dave for his helpful write-up titled Generate Script with Data from Database – Database Publishing Wizard.
(If you’d rather skip the narrative and go straight to the solution, please see the Solution heading…)
We use virtual machines for a lot of our testing — you just clone a VMWare LabManager image, deploy it, and off you go. You can do whatever you want to the configuration, because when you’re done you just undeploy and delete the workspace. It’s really nice.
The “clone a LabManager image” step works because someone regularly takes one of the nightlyish builds and installs it to a VM, creating an image suitable for others to clone.
For less generally applicable needs, such as builds of the team branch I’m working in, an image is not always available, but we can take the main branch and apply transformations to it to update it to the team branch.
And now we’ve arrived at the topic of this post. One of the transformations I commonly find myself needing to apply is a database schema update.
I can copy the SQL Server update scripts to the VM and run MS SQL Server Management Studio on the VM. This works, but it requires that I Remote Desktop into the VM, establish access to the VM’s file shares, and copy files from two different directories… and SQL Server Management Studio is sluggish on the VM.
I would prefer to run SQL Server Management Studio on my local PC, connecting to the VM… but the user account doesn’t have permissions to alter columns or create or drop tables. Do I have to use Remote Desktop after all?
Solution
Edit the properties of your user in MS SQL Server Management Studio under Security->Logins (you might need to get on the VM to do this) and add the
sysadmin server role to your user. Now you can create and drop tables, alter columns, etc. from a remote SQL Server Management Studio connection.
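The same change can be made in T-SQL (SQL Server 2005 syntax; the login name below is a hypothetical placeholder — substitute your own):

```sql
-- Add a login to the sysadmin fixed server role
EXEC sp_addsrvrolemember @loginame = 'MYDOMAIN\myuser', @rolename = 'sysadmin';
```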
In troubleshooting an issue this week involving the storage of Japanese characters to the database, I found out some things.
When I ran a query on the database using MS SQL Server (2005) Management Studio, the row that should have displayed Japanese characters instead came back as question marks:
Is it just my glasses?
My reaction in a situation like this is: “Is it really foggy out, or is it just my glasses?” Either the data was being stored incorrectly in the database or I was simply having trouble viewing it.
Here is a dump of the things I learned:
- The question marks indicate that the data was really not stored correctly (data loss).
- If instead your query results appear as square boxes (▯▯▯▯▯▯), your PC just does not have the language pack installed to view the characters. On Windows XP, go to
Control Panel -> Regional and Language Options and check the
Install files for East Asian languages checkbox.
- One possible cause of your data turning to question marks is inserting a literal Unicode string without the N prefix (e.g., N'myunicodestring'). Writing it as just 'myunicodestring' can cause non-Latin characters to be converted to question marks.
- Another cause is if your column is not of a Unicode type. For instance, on SQL Server, you would want to use one of the N-prefixed character column types such as nvarchar or nchar rather than varchar or char.
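A minimal sketch illustrating both causes (hypothetical table; the question-mark results assume a database whose default collation has no Japanese code page):

```sql
CREATE TABLE UnicodeTest(
    BadColumn  varchar(20),    -- non-Unicode column type: loses Japanese text
    GoodColumn nvarchar(20));  -- Unicode column type: can preserve it

-- Without the N prefix, the literal is converted to the database's default
-- code page before it ever reaches the column, so even GoodColumn suffers:
INSERT INTO UnicodeTest VALUES('こんにちは', 'こんにちは');

-- With the N prefix, GoodColumn is preserved; BadColumn still degrades
-- because the column type itself is non-Unicode:
INSERT INTO UnicodeTest VALUES(N'こんにちは', N'こんにちは');

SELECT BadColumn, GoodColumn FROM UnicodeTest;
DROP TABLE UnicodeTest;
```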