

  The Geek Culture Forums
  Math-a-holics and Code Junkies
  hexadecimals & memory allocation

Author Topic:   hexadecimals & memory allocation
donnab
Mini-Geek

Posts: 56
From: Cape Cod
Registered: Jan 2002

posted February 02, 2002 22:41
Probably a stupid newbie question, but I have always allocated memory to applications in OS9x in increments of 4, with the idea (maybe crazy) that if hexadecimal numeric systems are based on 16 vs 10, then using a number divisible by 4 would be a good idea. Is there any basis for this theory or am I just imagining things?

Maybe calculus does strange things to the brain.

------------------
Donna


EngrBohn
Highlie

Posts: 686
From: United States
Registered: Jul 2000

posted February 03, 2002 05:16
No such thing as stupid questions, only stupid people who ask them (j/k). We all were newbies at one time or another, so feel free to ask any "stupid newbie questions" you like.

Well, I suppose part of it is context. And the units -- you're allocating memory in increments of 4 what?

The way I parse your question, it sounds to me like you're saying that, from the operating system, you can manually allocate memory for a particular application. If so, wow, I would not have expected that to even be an option on a modern consumer OS. (No offense intended toward MacOS -- it's just that memory management can be handled much better by the OS than by humans; I'm sure MacOS will do it on its own, and a manual override of memory management is something I'd only expect to see on a research OS.) But if you can do this, then I'd suggest allocating memory in multiples of the page size -- modern OSes organize memory in "pages".

If you're talking about allocating memory while programming, then it's a different story. First, why do we use hexadecimal? Because it's a convenient shorthand for binary. Each hexadigit represents four bits in less written (or verbal) space. Even more conveniently, two hexadigits exactly represent the value of one byte, which is handy when we're examining the contents of a memory location. But we use hexadecimal for other purposes too, such as addressing, where it's just easier to poke address 0x0A439F30 than to write out (or worse, verbalize) the 32 bits corresponding to that hexadecimal notation.

So, as a programmer, should you allocate memory in units of 4? (Again, 4 what?) Since you're focused on hexadecimal notation, I suppose you might be talking about allocating memory in units of 4 bits. That's just a little too fine-grained. All primitive datatypes on modern computers are multiples of bytes. Yes, you could be working with a bitfield, but as a self-declared newbie, I find that unlikely, and even if you were working with a bitfield, the compiler will allocate memory rounded up at least to byte multiples (if not word multiples).

But let's assume you're not talking about allocating memory at the bit granularity. That is, you're doing something like
int *I = malloc( numelements * sizeof(int) );
Then my suggestion is in two parts. If you really think you're more clever than the compiler, allocate memory at least in multiples of words or better yet, in multiples of cache line sizes.

But let's assume you don't think you're more clever than the compiler (because, trust me, you aren't, and most likely, neither is anyone else on this board -- I can count on one hand the number of people I've met whom I believe could do a better job than the compiler; two hands for the number of people who can do clever things by taking advantage of the compiler's cleverness). Then allocate exactly how much memory you need for that data structure. No more, no less. Why? First, it'll be more maintainable. Second, the compiler will either allocate memory at the appropriate granularity for good performance for that data structure or will take advantage of the memory allocation requests for multiple data structures to get a good all-around efficiency. Third (and this is the clincher), if you were to request a certain amount of memory, say 32 bytes, the runtime allocator is actually going to block off more than what you're requesting (perhaps 34 bytes -- there are too many factors involved for me to make a good guess) since it needs a certain amount of overhead for runtime memory management. If you're allocating statically-sized arrays (e.g., int J[8]; ), then the third factor can be ignored, but the second factor becomes even more prominent.

So my advice: when programming, allocate exactly what you need (unless you know you're Very Clever and you need to be Very Clever and are willing to provide good documentation for future maintainers). At the OS level, let the OS' memory manager do its job and don't try to outsmart it.

------------------
cb
Oooh! What does this button do!?


nekomatic
Assimilated

Posts: 375
From: Manchester, UK
Registered: Mar 2000

posted February 04, 2002 09:02
I think donnab is talking about manually changing the memory allocation for each application in Mac OS 9, which can be done in the application's Get Info box. Yes, this may seem old-fashioned to users of other OSes, but it has some advantages -- if you set a large allocation for your image editor or audio sequencer, for example, and launch that app before any others, you can make sure that it doesn't get deprived of RAM by other apps. In any case, lots of today's apps can claim memory dynamically from the OS, over and above the amount you have allocated them. And OS X has transparent memory allocation.

To answer donnab's question, yes I expect that memory is allocated in multiples of a certain size, which is likely to be a power of 2, but I've never worried much about what it is - if it was large enough to be an issue I expect I'd have heard about it somewhere by now.

Then again, I have 640 meg in my Powerbook, so I can afford not to worry.


quantumfluff
Highlie

Posts: 672
From: the ether
Registered: Jun 2000

posted February 04, 2002 12:56
I second EngrBohn's opinion. Allocate what you need and don't try to second-guess the software that's giving you the memory.

In the case of programming, most allocators actually consume more storage than you ask for (for recordkeeping), so rounding up can be detrimental.

In the case of OS9, my guess is that you are allocating pages of memory. Asking for too much just wastes space.



© 2002 Geek Culture. All Rights Reserved.

Powered by Infopop www.infopop.com © 2000
Ultimate Bulletin Board 5.47e
