Generators and resource acquisition/release
One of the neatest things about language lawyers is that they have a keen eye for features of a language that may conflict with each other to produce fail. I, on the other hand, find it fun to stumble around in various languages and analyze interesting cases as I encounter them.
Generators in Python are a restricted form of the more general concept of coroutines. Generators are an elegant and concise way to write reasonably sized state machines. For that reason, you'll see them heavily associated with iterators (which are more syntaxerific [*] to write in a language without generators, like Java).
I used to envision generators as little stack frames that were detached from the call stack and placed somewhere in outer space, eating moon cheese and playing with the Django pony, where they lived happily ever after. Surprisingly, that concept didn't match up with reality too well.
PEP 342: Coroutines via Enhanced Generators and PEP 325: Resource-Release Support for Generators are the language-lawyer smack-down of my naive view. We used to be unable to perform proper resource acquisition within generators; notably, you couldn't yield from the try suite of a try/finally block, because the only way to guarantee that the finally block would run was to step the generator all the way to a StopIteration exception:
Restriction: A yield statement is not allowed in the try clause of a try/finally construct. The difficulty is that there's no guarantee the generator will ever be resumed, hence no guarantee that the finally block will ever get executed; that's too much a violation of finally's purpose to bear.
PEP 255 — Specification: Yield
from threading import Lock

lock = Lock()

def gen():
    try:
        lock.acquire()
        yield 'Acquired!'
    finally:
        lock.release()

if __name__ == '__main__':
    g = gen()
    print g.next()
We see the addition of this capability in Python 2.5:
$ python2.4 poc.py
  File "poc.py", line 8
    yield 'Acquired!'
SyntaxError: 'yield' not allowed in a 'try' block with a 'finally' clause
$ python2.5 poc.py
Acquired!
Before Python 2.5 there was no way to tell a generator to die and give up its resources. As PEP 342 describes, Python 2.5 turns generators into simple coroutines, which we can force to release their resources [†] when necessary via the close method:
>>> import poc
>>> g = poc.gen()
>>> h = poc.gen()
>>> g.next()
'Acquired!'
>>> g.close()  # Force it to release the resource, or we deadlock.
>>> h.next()
'Acquired!'
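Under the hood, close() works by raising a GeneratorExit exception at the point where the generator is suspended, which gives the finally suite its chance to run. A minimal sketch of that behavior (in Python 3 syntax, with a hypothetical `released` list added just so we can observe the finally suite running):

```python
from threading import Lock

lock = Lock()
released = []  # hypothetical flag so we can see the finally suite run

def gen():
    try:
        lock.acquire()
        yield 'Acquired!'
    finally:
        lock.release()
        released.append(True)

g = gen()
next(g)    # generator is now suspended at the yield, holding the lock
g.close()  # raises GeneratorExit at the yield; the finally suite runs
```

After close() returns, the lock is free for the next taker, which is exactly what the interpreter session above relies on.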
If you're wondering how I came across this combination in day-to-day Python programming, it was largely due to SimPy. I was writing a PCI bus simulation [‡] for fun, to help me get a grasp of the SimPy constructs and how they might affect normal object-oriented design. [§] I wanted to "acquire" a bus grant, so I analyzed the applicability of with for this resource acquisition.
I went to Stack Overflow and submitted a "feeler" question to see if there was some conventional Python wisdom I was lacking: Is it safe to yield from within a "with" block in Python (and why)?. The concept seemed relatively new to those in the discussion; however, the responses are still insightful.
This experience has demonstrated to me there are two modes of thinking when it comes to Python generators: short-lived and long-lived.
Typical, pre-Python 2.5 generator usage, where generators are really used like generators, lets you gloss over the difference between a regular function and a generator. Really, all you want from this kind of construct is some values to use right now. You're not doing anything super-fancy in the generator: it's just nicer syntax to have all of your local variables saved automatically in the generator function than to save them manually in a standalone iterator object.
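A hypothetical example of that short-lived style (Python 3 syntax): the generator is drained immediately, and none of its suspended state outlives the expression that consumes it.

```python
def evens(limit):
    # Local state (n) is saved across yields automatically -- no
    # hand-rolled iterator object with instance attributes required.
    n = 0
    while n < limit:
        yield n
        n += 2

result = list(evens(10))  # drained on the spot, then discarded
```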
Fancy, SimPy coroutine usage, where generators are managed as coroutines by a central dispatcher, makes a generator take on more serious object-like semantics. Shared-resource acquisition across coroutine yields should scare you at least as much as objects that acquire shared resources without releasing them right away. [¶] Perhaps more, seeing as you're lulled into a state of confidence by your understanding of short-lived Python generator behaviors.
We can do other things with the new capabilities, like feed values back into the generators:
def gen():
    feedback = (yield 'First')
    yield feedback

if __name__ == '__main__':
    g = gen()
    assert g.next() == 'First'
    assert g.send('Test') == 'Test'
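PEP 342 also adds a throw() method, which raises an exception at the point where the generator is suspended; the generator can catch it and keep going. A small sketch in Python 3 syntax:

```python
def gen():
    try:
        yield 'running'
    except ValueError as e:
        # throw() delivered the ValueError at the suspended yield above.
        yield 'caught %s' % e

g = gen()
first = next(g)                       # generator pauses at the first yield
result = g.throw(ValueError('boom'))  # exception raised there, caught, and
                                      # the next yielded value is returned
```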
The original PCI bus is approximately the "Hello, World!" of platform architecture, so far as I can tell.
I still haven't gotten a solid grasp of the design-methodology changes. If you want one generator to block until another subroutine succeeds or fails, you have to sleep and trigger wake events, with the possibility of interrupts. Can you tell I've never used a language with continuations before? ;-)
Deadlocking on mutually exclusive resources is easy with a cooperatively multitasking dispatcher: one entity (coroutine) holds the resource and yields, the dispatcher picks another one that wants that same resource, its non-blocking acquisition attempt fails, and you have a circular wait with no preemption: deadlock.
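A toy round-robin dispatcher (hypothetical, Python 3 syntax) makes that circular wait concrete: each coroutine grabs one lock, yields while holding it, then spins on a non-blocking attempt at the other lock, which the opposite coroutine holds.

```python
from threading import Lock

lock_a, lock_b = Lock(), Lock()

def worker(mine, theirs):
    mine.acquire()        # grab one resource...
    yield                 # ...then yield control while still holding it
    while not theirs.acquire(False):  # non-blocking attempt at the other
        yield             # never succeeds: the other worker holds it

# Each worker wants the lock the other one is holding.
coros = [worker(lock_a, lock_b), worker(lock_b, lock_a)]

# Round-robin dispatch, stepped a bounded number of times; without the
# bound, this loop would spin forever -- that's the deadlock.
for _ in range(10):
    for c in coros:
        next(c)
```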