British0zzy
July 18th, 2008, 09:22 PM
I wrote this program to print the maximum and minimum values of the constants defined in the limits.h header.
Here's the code:
#include <stdio.h>
#include <limits.h>

int main(void)
{
    printf("Bits in a char: %+d\n", CHAR_BIT);
    printf("Unsigned Character Maximum: %+d\n", UCHAR_MAX);
    printf("Unsigned Character Minimum: 0\n");
    printf("Signed Character Maximum: %+d\n", SCHAR_MAX);
    printf("Signed Character Minimum: %+d\n", SCHAR_MIN);
    printf("Unsigned Integer Maximum: %+ld\n", UINT_MAX);
    printf("Unsigned Integer Minimum: 0\n");
    printf("Signed Integer Maximum: %+d\n", INT_MAX);
    printf("Signed Integer Minimum: %+d\n", INT_MIN);
    printf("Unsigned Short Maximum: %+d\n", USHRT_MAX);
    printf("Unsigned Short Minimum: 0\n");
    printf("Signed Short Maximum: %+d\n", SHRT_MAX);
    printf("Signed Short Minimum: %+d\n", SHRT_MIN);
    printf("Unsigned Long Maximum: %+lld\n", ULONG_MAX);
    printf("Unsigned Long Minimum: 0\n");
    printf("Signed Long Maximum: %+ld\n", LONG_MAX);
    printf("Signed Long Minimum: %+ld\n", LONG_MIN);
    return 0;
}
When I run the program, here is the output:
Bits in a char: +8
Unsigned Character Maximum: +255
Unsigned Character Minimum: 0
Signed Character Maximum: +127
Signed Character Minimum: -128
Unsigned Integer Maximum: -1
Unsigned Integer Minimum: 0
Signed Integer Maximum: +2147483647
Signed Integer Minimum: -2147483648
Unsigned Short Maximum: +65535
Unsigned Short Minimum: 0
Signed Short Maximum: +32767
Signed Short Minimum: -32768
Unsigned Long Maximum: -8079434315340972033
Unsigned Long Minimum: 0
Signed Long Maximum: +2147483647
Signed Long Minimum: -2147483648
I am running this on a MacBook Pro with OS X 10.5.4 (I know... not Ubuntu).
Any thoughts?
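For what it's worth, the stray values look consistent with mismatched printf conversion specifiers rather than wrong constants in limits.h. UINT_MAX is an unsigned int but is printed with %ld: in a 32-bit build (which the +2147483647 for LONG_MAX suggests), long and int are the same size, so the all-ones bit pattern of UINT_MAX is reinterpreted as the signed value -1. ULONG_MAX is an unsigned long but is printed with %lld, which expects a wider long long argument, so printf reads past the value passed; that is undefined behavior and would explain the arbitrary large negative number. A minimal corrected sketch, assuming a 32-bit compiler where these limits fit the types shown (the %+ flag applies only to signed conversions, so it is dropped for the unsigned values):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* CHAR_BIT and the char/short limits promote to int, so %d matches. */
    printf("Bits in a char: %d\n", CHAR_BIT);
    printf("Unsigned Character Maximum: %d\n", UCHAR_MAX);
    printf("Signed Character Maximum: %+d\n", SCHAR_MAX);
    printf("Signed Character Minimum: %+d\n", SCHAR_MIN);
    /* UINT_MAX is an unsigned int: use %u, not %ld. */
    printf("Unsigned Integer Maximum: %u\n", UINT_MAX);
    printf("Signed Integer Maximum: %+d\n", INT_MAX);
    printf("Signed Integer Minimum: %+d\n", INT_MIN);
    printf("Unsigned Short Maximum: %d\n", USHRT_MAX);
    printf("Signed Short Maximum: %+d\n", SHRT_MAX);
    printf("Signed Short Minimum: %+d\n", SHRT_MIN);
    /* ULONG_MAX is an unsigned long: use %lu, not %lld. */
    printf("Unsigned Long Maximum: %lu\n", ULONG_MAX);
    printf("Signed Long Maximum: %+ld\n", LONG_MAX);
    printf("Signed Long Minimum: %+ld\n", LONG_MIN);
    return 0;
}

With matching specifiers, the unsigned maximums should come out as 4294967295 for unsigned int and unsigned long on this 32-bit build, instead of the -1 and garbage values above.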