The cotton belt, and the Cornish highlands

Two map-related items of interest:

From Pickin’ Cotton to Pickin’ Presidents correlates deep-south counties voting for Obama with cotton production in 1860, in a very striking manner. I found the following rather notable:

As it turns out, president-elect Obama won with overall support of 53%, but that includes over 90% of black voters.

Of white voters, only 43% voted for Obama; since Lyndon B. Johnson, no Democratic candidate for the highest office has ever garnered more than half the votes of European-Americans.

Then, comment #96 provides the geological context, expanded upon here, and in particular pointing at this fascinating map of “shorelines in the Cretaceous period”.

You can just see Britain on the right of that map, and ooh look, it’s all underwater apart from part of Scotland, most of Ireland, and south-west England including all of Devon and Cornwall. The most prominent topographical features of Devon and Cornwall these days are Dartmoor and Bodmin Moor, worn down “ancient mountains” as my geography teacher put it to me one day; it looks like, in the Cretaceous, they weren’t quite so worn down…

For example, see Maria Full Of Grace (Marston, 2004)

As I’ve been saying for a few years now, there’s no such thing as Fair-Trade cocaine; this may, of course, be freely construed as an argument for legalisation or for greater sanctions, according to your prevailing political worldview.

A quickly-jotted and probably ill-conceived note on language

For quite a while now I’ve been thinking, and saying, that languages are the central and fundamental modality for computing, ie everything one might want to do in computer science can/should be approached from a linguistic standpoint. I think I’ve just realised that this slightly misses the point, or at least doesn’t say much — because language is central and fundamental to any intellectual (or at least scientific/non-explicitly-sublime) endeavour.

Forget computer science for a moment, step back, and take a look at language.

Language is a serialisation mechanism for ideas and meaning. It exists and is beneficial because serialisation allows those things to be persisted, exchanged, and manipulated.

In natural language, the persistence mechanisms take the form of writing and recordings; the exchange mechanisms are speech if persistence isn’t required as a side effect, but otherwise largely use the persistence mechanisms. Manipulation takes the form of editing and rewriting at the syntactic level, and argument and debate at the semantic level (and propaganda/sociocultural programming at the political level).

Formal languages are central to computer science not because languages per se have anything much to do with computer science, but because formalisation, which means automation and mechanisation, is the very essence of computer science. It is the science of the mechanistic manipulation of data — “the latest stage in mankind’s ongoing quest to automate everything”, as JVT once said. Languages per se are fundamental to computer science only insofar as they are fundamental to all intelligent human endeavour, in their role as a serialisation mechanism for thought. The point with regard to computer science is not that we use language – that is an unavoidable side effect of thinking; the point is that we have to use formal languages, because of the things we choose to think about. As such, computer science is where the species’ expertise on the formal and formalisable aspects of language resides, mainly (colleagues in linguistics may take issue with this, of course, but my personal opinion is that the distinction between natural and formal is here very very deep, or at best that the formalisms behind natural language are intangible). At its heart, computer science is the science of formalisation; the language of such a science must, necessarily, be largely formal.

I guess that’s it. Does this make sense to anyone else? Maybe I’m not saying anything non-obvious. shrug

(This started, by the way, with me thinking about why the textbook on my table, “Languages and Machines”, is subtitled “An Introduction to the Theory of Computer Science”).

My eyes, they are burning!

One thing I’ve learnt, to my great surprise, while marking my students’ exams is this:

My handwriting, which I’ve always considered pretty bad, really really could be a hell of a lot worse.

*shudder*

Blogging as writing notes for public consumption

The Reason for Blogging, or rather Josef Svenningsson‘s reason for blogging – but I agree with what he says. In fact, I was only reading that post because he had commented on an earlier post of mine – a post which someone (ooh, dons) had submitted to programming.reddit.com and which hence received some attention. As such, the following particularly resonated:

But then, why do I blog, as opposed to just writing on a piece of paper? The reason for me is that the possibility that someone might read what I write helps me write. Blogging means that I have an (at least potential) audience which I can target my writing towards. This (perhaps imaginary) audience is very important for me, I wouldn’t be able to write without it. I simply can’t motivate myself to write only for my own sake.

Absolutely. That post of mine about sections started with me just playing around for my own sake, investigating something interesting I’d just come across for the first time. I like to make notes (my memory is lamentably poor), and a blog is a nice way to do that; but as Josef says, once you commit to publishing, you think more carefully about what you write, how it’s structured, etc. Of course, I’m fairly used to writing (sometimes extensive) notes for semi-public consumption in my work, but a blog is nicer than a lecture in that anyone can leave comments and expand my understanding. That happens in lectures sometimes, but rarely, and almost never with random Haskell experts from all over the world. :-)

Another goodie of Josef’s: yak shaving. Yep, been there.

On lies

Clive James on hoaxes & fraudsters. I’m with him on this one.

Some commentators regard fraudsters as romantic types, more interesting than poor old plodding us. These commentators say that few of these frauds would work without our greed. Perhaps not, but none of them would work without the propensity of the fraudster to lie.

Quite right.

Threads should pass messages, not share memory

Highly recommended reading for any of my students out there: a comparison of message-passing concurrency vs. shared-memory concurrency, with a healthy dose of historical perspective. The author introduces Erlang-style concurrency in a Java-ish setting, and does so quite well, to my mind.

Reading the introductory remarks about candidates in interviews, I was pleased, nay, smug to realise that – albeit inadvertently – I came to multi-threaded programming via the message-passing route, and would probably have made him quite happy if he’d interviewed me. Back when I worked at Frontier I did my first multi-threading work, in Python, and made heavy use of its excellent Queue class for inter-thread communication. Queue provides a thread-safe message passing mechanism, hiding all the nasty details of locking from me, which was exactly what I was looking for. My threads shared almost no state, and what state they did share was mostly Queue objects. They communicated by passing messages through Queues (messages could be anything, and often were), and it was all lovely and clean.
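For the curious, here’s a minimal sketch of the pattern (using today’s `queue` module – in the Python of that era it was the capital-Q `Queue` module; the “pretend work” of doubling each message is my invention, purely for illustration):

```python
import queue
import threading

def worker(inbox: queue.Queue, outbox: queue.Queue) -> None:
    """Consume messages until a None sentinel arrives; never touch shared state."""
    while True:
        msg = inbox.get()       # blocks until a message is available
        if msg is None:         # sentinel: shut down cleanly
            break
        outbox.put(msg * 2)     # pretend work, result sent back as a message

inbox: queue.Queue = queue.Queue()
outbox: queue.Queue = queue.Queue()
t = threading.Thread(target=worker, args=(inbox, outbox))
t.start()

for n in range(3):
    inbox.put(n)    # messages can be any object
inbox.put(None)     # ask the worker to stop
t.join()            # wait for it to finish -- no locks in sight

results = [outbox.get() for _ in range(3)]
print(results)  # [0, 2, 4]
```

All the locking lives inside `Queue`; the threads themselves just send and receive.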

Why did I go down that route? No genius; I just got lucky (yeah, lucky in that I was using Python not Java or C or C++). I had excellent advice from the good folk on comp.lang.python/python-list: this was the way to proceed. Of course, looking back I realise many of these guys knew all about message passing vs shared memory, they knew about Erlang, they knew about Haskell, hell some of them even knew about Lisp. A community as smart and welcoming as that one is a precious resource for a budding programmer.

Anyway, this led to two strongly noticeable results.

First, my code worked well, and didn’t suffer from mysterious hard-to-debug race conditions, etc. It “just worked”, as is often the way with Python.

Second (confession time), I didn’t actually learn properly about semaphores, monitors, shared memory concurrency and all its ridiculous fiddly baggage until I came to teach them in the Operating Systems module at Swansea! By then I’d already formed a strong sense that high-level languages (and Python in particular) made life so much sensibler, so the shared memory stuff slotted quite readily into the mental space of “low level stuff which has to be understood, but is best done by software not humans” (like much of that module).

I was discussing this whole issue with one of my students earlier in the week. If she closed her app’s main window while a worker thread was running, the program would exit uncleanly. This being Python, it was nothing more drastic than an exception/traceback, but she found this properly displeasing and wanted to clean it up (good, I said). It turned out that the main thread wasn’t waiting for the worker to finish: it exited immediately, cleaning up all data, including data the worker was trying to update. Hence, exception city. I showed the simple fix (make the main thread wait for the worker to finish, using a shared boolean “I’m not dead yet” variable), but then I tried to persuade her that message-passing concurrency was the way to go for all inter-thread communication. Even, said I, right down to the (frequent, many) interface updates caused by the worker thread. That is, I suggested, the worker shouldn’t update the GUI component directly, because the GUI is owned by the main thread. Instead, the worker should pass messages to the main thread telling it what updates to perform, and the main thread should poll for these messages and do the updates. I don’t think I sold her on it entirely, but maybe I planted sump’n.
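To make the suggestion concrete, here’s a toolkit-agnostic sketch of that shape (the message kinds, the `poll` helper, and the `applied` list are all my inventions for illustration; in a real GUI, `poll` would run from a timer such as Tk’s `after()`, and the append would be something like `progressbar.configure(value=...)`):

```python
import queue
import threading

updates: queue.Queue = queue.Queue()

def worker() -> None:
    """The worker never touches the GUI; it just describes updates as messages."""
    for pct in (25, 50, 75, 100):
        updates.put(("progress", pct))
    updates.put(("done", None))     # tell the main thread we're finished

def poll(applied: list) -> bool:
    """Main thread: drain pending messages; return False once the worker is done."""
    while True:
        try:
            kind, value = updates.get_nowait()
        except queue.Empty:
            return True             # nothing pending; poll again later
        if kind == "done":
            return False
        applied.append(value)       # here a real GUI would update its widgets

t = threading.Thread(target=worker)
t.start()
t.join()    # sketch only: ensure all messages are queued before we poll

applied: list = []
while poll(applied):
    pass
print(applied)  # [25, 50, 75, 100]
```

The worker owns nothing but its queue; the main thread owns the GUI and applies every change itself, so there’s no cross-thread widget access to go wrong.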

(Caveat: yes, if performance really matters – eg you’re doing volume graphics – this may be poor advice. For the other 95% of us, however…)

Callington museum goes up in flames

Sad day: Mum rang earlier to tell me that Callington museum caught fire this morning. Sadder still, I haven’t ever visited it – for all those years since it opened, I’ve always postponed going because “it’ll be there when I get round to it”. Well, too late Gimbo. Lesson in there for all of us, I’d say.

(Callington is my home town, btw.)

Update 2007-03-05: Dad sent me the following photo:

Callington Museum (burnt out)

Apparently he spoke to one of the firemen, and (I quote) it would appear most of the artifacts were saved, with only smoke & water damage. Cause was apparently an electrical fire at the control box.

Popgadget, and some thoughts on domains

Bash pointed me to popgadget yesterday — cool gadgetry and “stuff”, reported mostly from a geek girl PoV. Some representative goodies: Mmmm… Non-alcoholic malt liquor flavored with beef extracts! · go grammar girl! · your very own space age shower and turkish bath · skateboarding robot (rollerskating, more like?) · awesome inflatable iceberg climbing wall.

Peeve: once upon a time, .net domains were (albeit informally) reserved for people providing network services, eg ISPs, registrars, etc. These days it’s “just another” gTLD which anyone can buy space in, and I think that’s a shame. The appeal of the domain to its purchasers – as far as I can tell – is that you get to say “look! We’re on the net!”. Well, so? I mean, you have a domain, so clearly you’re on the net. Saying .net tells us nothing more about who you are. I guess what I’m trying to get at is: why have a variety of gTLDs if they have no actual meaning? As far as I can tell, the country TLDs have some meaning, but the gTLDs are just one big pool now.

Irony: I’m guilty of this too, having registered both gimbo.org.uk and gimbo.co.uk; in my defence, requests to the latter are rewritten as requests to the former, as I wish to discourage use of the latter, while retaining ownership. But why should I retain ownership? What if some fellow Brit forms a company called “gimbo” and wants the domain? Well, it’s obvious, isn’t it? I’d be rich! Rich! Rich! ;-)