

c - Different int sizes on my computer and Arduino

I'm working on a spare-time project, writing some server code for an Arduino Duemilanove, but before I test this code on the controller I am testing it on my own machine (an OS X MacBook). I use ints in several places, and I am worried that this will cause strange errors when the code is compiled and run on the Arduino, because the Arduino handles ints as 2 bytes while my MacBook handles them as 4 bytes. I'm not a hardcore C and C++ programmer, so I wonder how an experienced programmer would handle this situation. Should I restrict the code with a typedef that wraps my own definition of an int limited to 2 bytes? Or is there another way around this?
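
For example, this minimal check (just a throwaway sketch, not part of my server code) prints 4 on my MacBook, while on the Duemilanove's AVR it should print 2:

    #include <stdio.h>

    int main(void)
    {
        /* 2 on the AVR-based Duemilanove, 4 on a typical desktop */
        printf("sizeof(int) = %u\n", (unsigned)sizeof(int));
        return 0;
    }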


1 Answer


Your best bet is to use the stdint.h header. It defines typedefs with explicit signedness and width; for example, a 16-bit unsigned integer is a uint16_t. It's part of the C99 standard, so it's available pretty much everywhere. See:

http://en.wikipedia.org/wiki/Stdint.h
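
As a quick sketch (the variable names here are just illustrative), code written with these fixed-width types behaves identically on the Arduino and on the desktop:

    #include <stdint.h>
    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        uint16_t port  = 8080;     /* exactly 16 bits, unsigned, on every platform */
        int32_t  total = -100000;  /* exactly 32 bits, signed, on every platform   */

        /* The PRIu16/PRId32 macros from inttypes.h expand to the correct
           printf format specifier for each platform. */
        printf("port = %" PRIu16 ", total = %" PRId32 "\n", port, total);
        return 0;
    }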


