from Wiktionary, Creative Commons Attribution/Share-Alike License
- n. The act of making things parallel.
- n. The act of making operations work in parallel, particularly in, but not limited to, computing.
from The Century Dictionary and Cyclopedia
- n. The act of parallelizing or making parallel.
Sorry, no etymologies found.
I got a lot of surprised reactions from people who thought that parallelization is this big complicated thing, requiring supercomputers and large funds, and rather large problems, to be worthwhile.
To compensate for my expensive choice, I set everything up for parallel computation from the start using Mathematica 7’s built-in parallelization tools.
We demonstrated a broad range of features, including Mathematica 7’s fully curated chemical, genomic, and proteomic data, built-in parallelization capabilities, and unsurpassed modeling and visualization capabilities.
The process of breaking a program down into threads is called parallelization, and it allows computers to run programs much more quickly.
Performance tuning processes such as parallelization, caching and indexing
The parallelization here is pretty simple: launch the available kernels, distribute the definitions of the program, and then use ParallelEvaluate to send the image over the link and request the remote creation of the interpolation functions.
Not all computations benefit from parallelization.
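The parallel-map pattern the examples above describe can be sketched in a few lines of standard-library Python. This is an illustration only, not the Mathematica code being quoted: the worker function and inputs are invented, and for tiny tasks like these the coordination overhead can easily outweigh any speedup, which is exactly why not every computation benefits.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(x):
    # Stand-in for an expensive, independent computation.
    return x * x

inputs = range(8)

# Serial version: one result at a time.
serial = [simulate(x) for x in inputs]

# Parallel version: the executor distributes the calls across
# a pool of workers, preserving input order in the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(simulate, inputs))

assert serial == parallel  # same answers, different scheduling
```

The key property that makes the problem "worthwhile" to parallelize is that each call is independent of the others; the scheduler is then free to overlap them.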
Much of the talk was about the need/potential for parallelization via GPUs.
First, the model is easy to use, even for programmers without experience with distributed systems, since it hides the details of parallelization, fault-tolerance, locality optimization and load balancing.
MapReduce allows developers to write applications in their language of choice (Java, C#, Python, C++, R, etc.) while handling the details of parallelization behind the scenes.
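The "details handled behind the scenes" amount to three steps: a map phase that emits key-value pairs, a shuffle that groups intermediate values by key, and a reduce phase that combines each group. A toy single-machine sketch in Python, using the classic word-count example (the function names here are illustrative, not part of any MapReduce API):

```python
from collections import defaultdict

def map_phase(document):
    # Emit a (word, 1) pair for every word, as a mapper would.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Group intermediate values by key. In a real framework this
    # step also handles partitioning across machines, locality,
    # and fault tolerance -- the details the quote refers to.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Combine each key's values into a final result.
    return {key: sum(values) for key, values in groups.items()}

documents = ["to be or not to be", "be here now"]
pairs = [pair for doc in documents for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["be"])  # → 3
```

Because the mapper and reducer are pure functions over independent records, the framework can run thousands of them in parallel without the programmer writing any threading or distribution code.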