Was inspired to write up a few notes on the SQL Server DECIMAL data type. This is pretty basic stuff, but everyone should know it.
If you declare a column as DECIMAL with no arguments, it defaults to a precision of 18 and a scale of 0, i.e. DECIMAL(18, 0). Not very useful, is it? You might as well have used a BIGINT column if all you wanted was to store integers. So first off: when you declare a DECIMAL column, always give it an explicit precision and scale, as in the sketch below.
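Here is a quick illustration of the difference (the table and column names are made up for the example):

```sql
-- Hypothetical demo table: one column left at the default, one with explicit precision/scale.
CREATE TABLE dbo.PriceDemo
(
    PriceDefault  DECIMAL,         -- defaults to DECIMAL(18, 0): no fractional digits
    PriceExplicit DECIMAL(10, 2)   -- 10 total digits, 2 of them after the decimal point
);

INSERT INTO dbo.PriceDemo (PriceDefault, PriceExplicit)
VALUES (19.99, 19.99);

-- PriceDefault comes back as 20 (silently rounded to scale 0),
-- while PriceExplicit keeps 19.99.
SELECT PriceDefault, PriceExplicit
FROM dbo.PriceDemo;
```

The fractional part in the first column is simply rounded away, and nothing warns you about it, which is exactly why relying on the default is a trap.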