This SQL Server 2005 T-SQL code:
DECLARE @Test1 varchar;
SET @Test1 = 'dog';
DECLARE @Test2 varchar(10);
SET @Test2 = 'cat';
SELECT @Test1 AS Result1, @Test2 AS Result2;
produces:
Result1 = d
Result2 = cat
I would expect either:
- The assignment SET @Test1 = 'dog'; to fail because there isn't enough room in @Test1, or
- The SELECT to return 'dog' in the Result1 column.
What is up with @Test1? Could someone please explain this behavior?
Let me answer with some quotes from the SQL Server documentation.
From char and varchar (Transact-SQL):
"When n is not specified in a data definition or variable declaration statement, the default length is 1. When n is not specified when using the CAST and CONVERT functions, the default length is 30."
From Converting Character Data:
"When character expressions are converted to a character data type of a different size, values too long for the new data type are truncated."
So, your varchar is declared as a varchar(1), and the implicit conversion in your SET statement (from a string literal of length 3 to a varchar(1)) truncates 'dog' to 'd'.
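Note that the default length differs between declarations and conversions, which makes the behavior even easier to trip over. A short sketch to demonstrate both defaults side by side (variable names are illustrative):

```sql
-- varchar with no length in a DECLARE defaults to varchar(1)
DECLARE @Short varchar;
SET @Short = 'dog';          -- silently truncated to 'd'

-- varchar with no length in CAST/CONVERT defaults to varchar(30)
DECLARE @FromCast varchar(50);
SET @FromCast = CAST('dog' AS varchar);  -- 'dog' fits in varchar(30)

SELECT @Short AS ShortResult,            -- 'd'
       @FromCast AS CastResult,          -- 'dog'
       LEN(@Short) AS ShortLen;          -- 1
```

The practical takeaway is to always specify an explicit length (e.g. varchar(10)) in both declarations and conversions, so neither default can surprise you.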