In my last two posts, I rather glibly talked about SNOMED, OWL, ontologies, Protege, etc. without mentioning any hardware or software issues that sometimes arise.
For most of us, these matters are best understood through our own trial-and-error experience. Nonetheless, I've added links to a couple of informative anecdotes below: in the first, an account of someone computing successfully with 8 dual CPUs and 15 GB of RAM; in the second, an account of someone computing unsuccessfully with far fewer resources.
Link 1: Modeling Massive Ontologies (SNOMED) at Kaiser
Link 2: SNOMED OWL On Protege
Footnote 1: Some large OWL files will not load into Protege on a 32-bit computer because doing so requires more than 4 GB of RAM, the theoretical addressing limit of a 32-bit system. To get around this limitation, a 64-bit machine is needed. Fortunately, Linux, Windows, and Mac OS are all available in 64-bit versions.
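Before hunting for a bigger machine, it helps to confirm whether your current environment is actually 32-bit or 64-bit. As a minimal sketch (illustrative only, not part of the Protege toolchain), here is how you can check the pointer width of a running Python interpreter and the address-space ceiling it implies:

```python
import struct

def word_size_bits() -> int:
    """Return the pointer width (32 or 64) of the running interpreter."""
    return struct.calcsize("P") * 8

def max_address_space_bytes(bits: int) -> int:
    """Theoretical address-space ceiling for a given pointer width."""
    return 2 ** bits

if __name__ == "__main__":
    bits = word_size_bits()
    ceiling_gib = max_address_space_bytes(bits) / 2 ** 30
    print(f"{bits}-bit interpreter; address-space ceiling: {ceiling_gib:.0f} GiB")
```

On a 32-bit build this reports a 4 GiB ceiling, which is exactly the wall a large OWL file runs into.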
Footnote 2: The entire computing industry is moving from 32-bit to 64-bit technology, and it’s easy to see why. While many of today’s computers can hold far more than 4GB of physical memory, the 32-bit applications that run on them can address only 4GB of RAM at a time. 64-bit computing shatters that barrier by enabling applications to address a theoretical 16 billion gigabytes of memory, or 16 exabytes. 64-bit processors can also move twice as much data per clock cycle, which can dramatically speed up numeric calculations and other data-intensive tasks.
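The figures in Footnote 2 fall straight out of the arithmetic. A quick sketch (binary units assumed) showing where the 4 GB wall and the "16 exabytes" figure come from:

```python
# Address-space ceilings implied by pointer width (binary units).
GIB = 2 ** 30   # one gibibyte
EIB = 2 ** 60   # one exbibyte

limit_32 = 2 ** 32   # bytes addressable with 32-bit pointers
limit_64 = 2 ** 64   # bytes addressable with 64-bit pointers

print(limit_32 // GIB)   # 4  -> the 4 GB wall
print(limit_64 // EIB)   # 16 -> the "16 exabytes" figure
```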