Monday, February 13, 2012

Average

I have a field in my database for the duration of a phone call. It is input as decimal values of minutes and I have it stored as the real datatype... When I use an SQL query to calculate the average I get an inaccurate answer... I'm new to databases and don't really know what datatype I should store it as... does anyone have any suggestions?|||A real data type should work fine. Are you trying to store a base-60 number in a base-10 data type? Why not convert the decimal into a number of seconds (if you need that much resolution) and store the result in an int? Maybe an example would help.|||I'm importing values from a text file into the database. The values in the text file are simple decimals... e.g. 0.1, 28.78... and are imported into the db as the same.

Let me try to explain. It's a database that stores the duration of a phone call and the number of the phone from which it was made, amongst other fields. One of the phone numbers made two outgoing calls, both 0.1 minutes long. I wrote the query, select avg(duration)..., and it returned the value 0.100000001490116 for the average!!!
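(That value is exactly what single-precision storage predicts: real is a 4-byte IEEE 754 float, and 0.1 has no exact binary representation. A quick sketch, in Python rather than T-SQL, just to illustrate the rounding:)

```python
import struct

# Round-trip 0.1 through a 4-byte IEEE 754 single (like SQL Server's "real")
# to see the value that actually gets stored.
stored = struct.unpack('f', struct.pack('f', 0.1))[0]
print(stored)  # 0.10000000149011612

# Averaging two identical calls just returns the same approximate value.
avg = (stored + stored) / 2
print(avg)     # 0.10000000149011612
```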

I hope you can shed some light on this for me because as I said, I'm relatively new to databases and I don't know where the problem lies!

Thanks...|||DING DING DING

Float and Real are approximate number data types. What you got was approximately 0.1. Change your data type to Decimal or Numeric; Decimal(9,4) would probably work.|||Thanks for your help so far...

Pardon my ignorance, but what do you mean by the (9,4) in decimal(9,4)?|||from Books Online:

decimal and numeric
Numeric data types with fixed precision and scale.

decimal[(p[, s])] and numeric[(p[, s])]

Fixed precision and scale numbers. When maximum precision is used, valid values are from -10^38 + 1 through 10^38 - 1. The SQL-92 synonyms for decimal are dec and dec(p, s).

p (precision)

Specifies the maximum total number of decimal digits that can be stored, both to the left and to the right of the decimal point. The precision must be a value from 1 through the maximum precision. The maximum precision is 38. The default precision is 18.

s (scale)

Specifies the maximum number of decimal digits that can be stored to the right of the decimal point. Scale must be a value from 0 through p. Scale can be specified only if precision is specified. The default scale is 0; therefore, 0 <= s <= p. Maximum storage sizes vary, based on the precision.|||Thanks a million, Paul... that's a great help...
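(To see why a fixed precision and scale fixes the average, here is a sketch using Python's decimal module standing in for a decimal(9,4) column — the column name and values are just the ones from the thread:)

```python
from decimal import Decimal

# Two 0.1-minute calls, stored exactly as fixed-point decimals
# (standing in for a decimal(9,4) duration column).
calls = [Decimal('0.1000'), Decimal('0.1000')]
avg = sum(calls) / len(calls)
print(avg == Decimal('0.1'))  # True: the average is exactly 0.1

# Scale 4 means four digits kept to the right of the decimal point:
print(Decimal('28.78').quantize(Decimal('0.0001')))  # 28.7800
```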
