Upsides of being down — the positive side of depression.
Having spent most of 2006 and 2007 going through this myself, I can, I think, agree. It was unutterably awful, the worst experience of my life bar none (happily, both my parents are still alive) and I would not choose to repeat it but on the other hand, on the other hand, I am, somehow, improved, I think. I don’t know if I’d go so far as to say more resilient but yeah, with a better perspective, less rose tints (but also no yawning chasms of nihilism), and a bit more serious. But don’t worry: not too much more.
To explain why depression has not been “bred out” through Darwinian natural selection, theories have suggested that rather than being a defect, depression could be a defence against the chronic stress that misguided people can put themselves under. It is possible that depression defends us against the tendency to deny our true needs by chasing unobtainable goals and helps to bring these needs into sharper focus. More specifically, the proposed benefits are as follows: removal from a stressful situation, introspection, problem solving, the development of a new perspective, and reintegrating this with the community upon recovery.
On a related but geeky note, it’s very annoying that the otherwise excellent Guardian Unlimited fails (yes, epic) when it comes to search. Go to the front page and search for “upsides of being down”, the title of this article. It’s a hit, but you have to scroll down a long way to see it; Google, on the other hand, had it right at the top on the day it was published — damn, they good! So come on, Simon, sort it out. ;-)
USA Democratic Party “global primary” for Democrats Abroad badly run, insecure, untrustworthy — just like almost all (all?) electronic voting systems in use today.
There are well-known risks at every stage of the episode, so I repeat: that whole process was neither secure nor well-run; moreover, its collection
of personal information using unsecured Web pages exposed participants to the risk of information theft, and delivering notionally secure information
by email is painfully bad judgment. The episode proves nothing except that well-intentioned people continue to make elementary but serious errors in
designing and setting up processes that must be safe at every step if they are to be meaningful.
Don’t like getting to sleep at night? Read the RISKS digest avidly.
… and have done for five years.
For quite a while now I’ve been thinking, and saying, that languages are the central and fundamental modality for computing, ie everything one might want to do in computer science can/should be approached from a linguistic standpoint. I think I’ve just realised that this slightly misses the point, or at least doesn’t say much — because language is central and fundamental to any intellectual (or at least scientific/non-explicitly-sublime) endeavour.
Forget computer science for a moment, step back, and take a look at language.
Language is a serialisation mechanism for ideas and meaning. It exists and is beneficial because serialisation allows those things to be persisted, exchanged, and manipulated.
In natural language, the persistence mechanisms take the form of writing and recordings; the exchange mechanisms are speech if persistence isn’t required as a side effect, but otherwise largely use the persistence mechanisms. Manipulation takes the form of editing and rewriting at the syntactic level, and argument and debate at the semantic level (and propaganda/sociocultural programming at the political level).
Formal languages are central to computer science not because languages per se have anything much to do with computer science, but because formalisation, which means automation and mechanisation, is the very essence of computer science. It is the science of the mechanistic manipulation of data — “the latest stage in mankind’s ongoing quest to automate everything”, as JVT once said. Languages per se are fundamental to computer science only insofar as they are fundamental to all intelligent human endeavour, in their role as a serialisation mechanism for thought. The point with regard to computer science is not that we use language – that is an unavoidable side effect of thinking; the point is that we have to use formal languages, because of the things we choose to think about. As such, computer science is where the species’ expertise on the formal and formalisable aspects of language reside, mainly (colleagues in linguistics may take issue at this, of course, but my personal opinion is that the distinction between natural and formal is here very very deep, or at best that the formalisms behind natural language are intangible). At its heart, computer science is the science of formalisation; the language of such a science must, necessarily, be largely formal.
I guess that’s it. Does this make sense to anyone else? Maybe I’m not saying anything non-obvious. shrug
(This started, by the way, with me thinking about why the textbook on my table, “Languages and Machines”, is subtitled “An Introduction to the Theory of Computer Science”).
Unix tools, and how I use them at Andrew Birkett’s blog, the latest addition to my feedspace.
There’s some really nice stuff here I didn’t know about. cstream, iftop, and less -S are all new to me. watch is, of course, indispensable.
Here are some very interesting notes on a keynote by Alan Cooper (of “The Inmates Are Running The Asylum” fame) at the recent IxDA Interaction08 conference.
I don’t think we ought to be emphasizing innovation; in our industry, it will happen. Innovation means invention, but in the minds of business people, I believe it has come to mean “success”. But innovation doesn’t mean success (holds up that original clunky MP3 player that failed – “this was innovative, but was never successful”).
Business people don’t often do best-to-market because they don’t know how – we need to show them. Many have industrial age skills. They view software as mass production. Best-to-market only happens through craftsmanship. It’s all about quality – it’s all about getting it right, not getting it fast. It’s measured by quality, not speed. It’s a pure measurement, and a delightful one. Craftsmen do it over and over again until they get it right. In their training, they build things over and over so they get the experience they need to get it right.
Programming is not an industrial activity. It is a unique activity that has a lot in common with pre-industrial craft, and yet has a lot of unique characteristics that I call “post-industrial craft”. Programs are made one at a time, and each piece is different. It’s not scalable and it’s not formulaic. There’s no magic bullet in pre-industrial craft. We can’t say we’re all going agile and everything will come up roses. It’s incredibly nuanced and takes years of study to get right.
Programmers are craftsmen. However, they are different than pre-industrial workers. They are self-directed and know better than managers what to do. They respect intelligence, not authority. You can’t tell them what to do, you can only coerce them. Their satisfaction comes from the quality of their work.
But there are no economies of scale in software production; all you can do is reduce the quality that emerges. There are simply no good management tools for software construction. There are no appropriate tools for accounting for the creation of software. There’s no way to track any given feature, functionality, or behavior to the amount of money coming in.
… and it goes on from there to say (less clearly, alas) what interaction designers can bring to this conflicted situation. Food for thought.
Here’s one of my favourite little logic problems, courtesy of Faron. It’s about time I got this off my noticeboard and onto my interwebs.
Which of the following is true?
- All of the below.
- None of the below.
- All of the above.
- One of the above.
- None of the above.
- None of the above.
Solutions involving exhaustive analysis of the 2^6 possibilities are forbidden, although elegant code doing so will at least be admired for its cleverness, especially if it’s in Haskell. ;-)
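Since the post explicitly invites it (if sadly not in Haskell), here’s the forbidden exhaustive check as a Python sketch: enumerate all 2^6 truth assignments and keep those that are self-consistent, i.e. where each statement’s truth value matches what it claims. Note one interpretive assumption of mine: statement 4’s “One of the above” is read as “exactly one of the above”.

```python
from itertools import product

def consistent(b):
    """True if the 6-tuple of booleans b is self-consistent:
    each statement's assigned truth value equals what it asserts."""
    claims = (
        all(b[1:]),        # 1. All of the below.
        not any(b[2:]),    # 2. None of the below.
        all(b[:2]),        # 3. All of the above.
        sum(b[:3]) == 1,   # 4. One of the above (read as: exactly one).
        not any(b[:4]),    # 5. None of the above.
        not any(b[:5]),    # 6. None of the above.
    )
    return tuple(claims) == b

solutions = [b for b in product([False, True], repeat=6) if consistent(b)]
print(solutions)  # exactly one assignment survives: only statement 5 is true
```

Running this confirms the puzzle has a unique answer (the fifth statement, i.e. the first “None of the above”), though of course the elegant route is to deduce it by hand: statements 1–3 quickly force contradictions, which cascades down to 5.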
Waiting: A Necessary Part of Life — Don Norman on buffers. I think this has possibly the best first sentence of anything I’ve ever read, ever:
Just as dirt collects in crevices, buffers collect in the interfaces between systems.
I’m impressed by the breadth of this. I like the idea that “interface” is a general and fundamental enough concept that the problems we see, and the mental models/tools for thinking about/solving them are essentially similar whether we’re talking about device usability, protocol design, or soft squidgy stuff involving only meat entities.
Problems arise at interfaces, any interface, be it person and machine, person and person, or organizational unit and organizational unit. Any place where two different entities interact is an interface, and this is where confusions arise, where conflicting assumptions are born and nourished, where synchronization difficulties proliferate as queues form and mismatched entities struggle to engage.
To the analyst, such as me, interfaces are where the fun lies. Interfaces between people, people and machines, machines and machines, people and organizations. Anytime one system or set of activities abuts another, there must be an interface. Interfaces are where problems arise, where miscommunications and conflicting assumptions collide. Mismatched anything: schedules, communication protocols, cultures, conventions, impedances, coding schemes, nomenclature, procedures. It is a designer’s heaven and the practitioner’s hell. And it is where I prefer to be.
Also worth a read: A Fetish for Numbers: Hospital Care.
From the same good people who brought you the lambda calculus in a can [ultimate].
Fix stuck/dead pixels on your monitor with Killdeadpixel
Awesome — supposedly can often fix stuck pixels with minimal effort on the part of the human. Almost makes me wish I had one to try it out on… ;-)
Reading and listening to Long Now stuff has given me a (at its most optimistic) more skeptical and (at its most pessimistic) more long-term, non-human-centric view of stories like this, but it’s nonetheless food for thought (excuse the pun): Financial Times: impending food crisis? [i-r-squared via jreighley, randomly via twitter].
Hard red wheat is limit up again (I think that’s 9 days out of 11) and is at $19.80 a bushel. When it broke $6 a bushel last summer, that was an all-time high.
A WFP official, for example, recently showed me the red plastic cup that is used to dole out daily rations to starving Africans – and then explained, in graphically moving terms, that this vessel is typically now only being filled by two-thirds each day, because food prices are rising faster than the WFP budget.
But it’s this that really caught my eye:
Armed police arrested a man listening to his MP3 player and took a sample of his DNA after a fellow commuter mistook the music player for a gun. [smallcool]
Darren Nixon had been waiting at a bus stop in Stoke-on-Trent on his way home from work when a woman saw him reach into his pocket and take out a black Philips MP3 player. The woman thought it was a pistol and called 999.
Police tracked 28-year-old Nixon using CCTV, sending three cars to follow him. When he got off the bus, armed officers surrounded him. He was driven to a police station, kept in a cell and had his fingerprints, photograph and DNA taken.
The Liberal Democrats, who are campaigning to have the DNA records of innocent people destroyed, said the national DNA database now held more than 3m records kept for life, an estimated 125,000 of which belong to people who were neither cautioned nor charged.
Quite an interesting five-part series on database version control. I haven’t used this particular approach myself, but it looks well thought through and sensibly structured, at first blush at least.
Slowing down — awesome video.
New York-based performance art collective Improv Everywhere showcases their latest project, “Frozen Grand Central”, which mischievously targeted victims of the Big Apple’s notoriously short now.
Quite so: prototypes and real applications.
Your prototype needs to be written quickly and then it needs to change quickly. You’ll only be able to do that with a maintainable, flexible code base. In short, a well-written code base. You’re a proficient software engineer, you know how to do this. You probably do it without even thinking.
And at some level, everyone knows this. That’s why prototypes are created in languages like Python. A language that you can write quickly, but also write well, quickly.