Talking about how he has managed to move through very different schools of music (from Bossa Nova to classical) and to bring the mix of all this to different places: I’m a little bit like a character actor. I put myself in different situations and none of them are really authentic.
Ethnography is a branch of anthropology and the systematic study of individual cultures. Ethnography explores cultural phenomena from the point of view of the subject of the study. Ethnography is also a type of social research that involves examining the behavior of the participants in a given social situation and understanding the group members’ own interpretation of such behavior.
In authenticity, validation happens through “recognition”—a concept especially elaborated by Hegel, one of the major philosophers of the age of authenticity.
Recognition is “inter-subjective”—it also requires a present peer. But not just anyone—you need another authentic individual to recognize your own authenticity. In authenticity you need an authentic soul mate instead of a sincere “role mate”. You need someone who is as special as you so that he or she in turn can truly recognize how special you are. In both sincerity and authenticity, you rely on present peers to validate your identity: you need to be seen by them.
These present peers, the role mates or soul mates, look at one another directly: they relate to one another in the mode of first-order observation. In profilicity, a decisive switch happens. You need to be validated by your peers as well—but, curiously, by peers who are not present.
You need to be validated by a general peer.
Instead of simply being seen by present peers, in profilicity, you are seen as being seen. Think again of the celebrity as an archetype of this identity technology: You cannot relate to celebrities in first-order observation, only in second-order observation. You never see them directly; you see them as being seen.
In profilicity, you adopt the “celebrity mode” for yourself. You not only look at others but show yourself in this mode. You curate an image of yourself as if you were seen as being seen by a general peer. You see yourself as being seen.
Replying to Alan Kay The most important thing, I think, is decoupling “solver time” from “UI time”. Just because the solver iterates through a series of steps over (computer) time doesn’t mean that the appropriate representation is an animation over (human) time. In this case, it’s best to show all the steps simultaneously. This allows you to see the convergence, as well as skim your mouse over a chart to “pick out” particular steps and study them.
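A minimal sketch of this idea, with Newton's method standing in as a hypothetical "solver" (all names here are my own, not from the quote): run every iteration up front, then render the whole trajectory at once, so convergence is visible at a glance rather than unfolding as an animation.

```python
import math

def newton_sqrt_steps(x, n_steps=8):
    """Newton's method for sqrt(x), recording every iterate instead of animating them."""
    steps = [x]  # initial guess: x itself
    for _ in range(n_steps):
        g = steps[-1]
        steps.append(0.5 * (g + x / g))  # Newton update for g^2 - x = 0
    return steps

steps = newton_sqrt_steps(2.0)

# Show the whole trajectory simultaneously: one row per step, with a text bar
# whose length tracks the remaining error, so convergence is a shape you can
# scan (or skim a mouse over, in a real UI), not a movie you must sit through.
for i, s in enumerate(steps):
    err = abs(s - math.sqrt(2.0))
    bar = "#" * int(40 * min(err, 1.0))
    print(f"step {i:2d}  {s:.12f}  {bar}")
```

In an interactive version, each row would be a hoverable element, letting you "pick out" a particular step and study it, which is exactly what an animation over human time makes difficult.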
Our zeal may have us mistaken for retrograde fundamentalists. We are zealots. We are students of a forgotten past. And we are ever-concerned with the foundational units of our new world-to-be. But we are pragmatic prototypes at heart, and we will submit to great pains to catch a glimpse of what might lie beyond.
As with Mindstorms, I had to give up on extracting key quotes because I found myself transcribing the entire paper.
It’s tempting to judge what you read:
I agree with these statements, and I disagree with those.
However, a great thinker who has spent decades on an unusual line of thought cannot induce their context into your head in a few pages. It’s almost certainly the case that you don’t fully understand their statements.
Instead, you can say:
I have now learned that there exists a worldview in which all of these statements are consistent.
And if it feels worthwhile, you can make a genuine effort to understand that entire worldview. You don’t have to adopt it. Just make it available to yourself, so you can make connections to it when it’s needed.
In a way, television is quite unprecedented. In the past, audiences were gathered for specific reasons, to hear speeches or even to see specific events, but television doesn’t do that. Its job is to gather an audience, and it doesn’t really much care what it uses as the means to gather one.
How serious can a flood in Mexico be, or an earthquake in Japan, if it is preceded by a Calvin Klein jeans commercial and followed by a yogurt commercial? I mean, that fact in itself so changes the content of what passes for news that I find it an embarrassment to the very idea of an informed public.
When radio first came on the scene, even such a conservative observer as Herbert Hoover, who was Secretary of Commerce when the Radio Act was first passed, didn’t believe that it was possible that radio could be used for commercial purposes. He saw it as strictly an educational medium.
We have President Reagan and Vice President Mondale in front of the camera, and someone like Barbara Walters says: “The first question for you, Mr. President, is: what do you think is the solution to the problem in the Middle East? You will have two minutes to answer, after which Vice President Mondale will have 60 seconds for rebuttal.”
Now, who can take that seriously? If Reagan and Mondale were in fact serious men, they would turn to Miss Walters and say, “What kind of men do you think we are? We’re running for the highest office in the land! You can’t answer a question like this in two minutes, nor can you rebut someone else’s answer in 60 seconds!” Or they might turn to Miss Walters and say, “What kind of people do you think the American public is, that they will put up with a forum in which candidates for the presidency are asked to respond to a question like this in two minutes, or to rebut in one minute?”
But in fact, none of that ever happens. Reagan does answer and Mondale does give his rebuttal, and everyone goes on with this charade that television is informing the public.
They delude themselves who believe that television and print coexist, for coexistence implies parity. There is no parity here. Print is now merely a residual epistemology, and it will remain so, aided to some extent by the computer, and newspapers and magazines that are made to look like television screens. Like the fish who survive a toxic river and the boatmen who sail on it, there still dwell among us those whose sense of things is largely influenced by older and clearer waters.
We do not see nature or intelligence or human motivation or ideology as “it” is but only as our languages are. And our languages are our media. Our media are our metaphors. Our metaphors create the content of our culture.
Typography fostered the modern idea of individuality, but it destroyed the medieval sense of community and integration. Typography created prose but made poetry into an exotic and elitist form of expression. Typography made modern science possible but transformed religious sensibility into mere superstition. Typography assisted in the growth of the nation-state but thereby made patriotism into a sordid if not lethal emotion.
The form in which ideas are expressed affects what those ideas will be.
The concept of truth is intimately linked to the biases of forms of expression. Truth does not, and never has, come unadorned. It must appear in its proper clothing or it is not acknowledged, which is a way of saying that the “truth” is a kind of cultural prejudice. Each culture conceives of it as being most authentically expressed in certain symbolic forms that another culture may regard as trivial or irrelevant.
It may well be that the development of an American literature was retarded not by the industry of the people or the availability of English literature but by the scarcity of quality paper. As late as Revolutionary days, George Washington was forced to write to his generals on unsightly scraps of paper, and his dispatches were not enclosed in envelopes, paper being too scarce for such use.
Truth, like time itself, is a product of a conversation one has with oneself about and through the techniques of communication one has invented.
Karl Marx from The German Ideology. “Is the Iliad possible,” he asks rhetorically, “when the printing press and even printing machines exist? Is it not inevitable that with the emergence of the press, the singing and the telling and the muse cease; that is, the conditions necessary for epic poetry disappear?”
[…] Marx understood well that the press was not merely a machine but a structure for discourse, which both rules out and insists upon certain kinds of content and, inevitably, a certain kind of audience.
Since the male literacy rate in seventeenth-century England did not exceed 40 percent, while in America, as Postman says, it was 98 percent at the same time, we may assume, first of all, that the migrants to New England came from more literate areas of England or from more literate segments of the population, or both. In other words, they came here as readers and were certain to believe that reading was as important in the New World as it was in the Old.
One significant implication of this situation is that no literary aristocracy emerged in Colonial America. Reading was not regarded as an elitist activity, and printed matter was spread evenly among all kinds of people. A thriving, classless reading culture developed because, as Daniel Boorstin writes, “It was diffuse. Its center was everywhere because it was nowhere: Every man was close to what [printed matter] talked about. Everyone could speak the same language. It was the product of a busy, mobile, public society.”
In every tool we create, an idea is embedded that goes beyond the function of the thing itself. It has been pointed out, for example, that the invention of eyeglasses in the twelfth century not only made it possible to improve defective vision but suggested the idea that human beings need not accept as final either the endowments of nature or the ravages of time. Eyeglasses refuted the belief that anatomy is destiny by putting forward the idea that our bodies as well as our minds are improvable. I do not think it goes too far to say that there is a link between the invention of eyeglasses in the twelfth century and gene-splicing research in the twentieth.
When we were first drawn together as a society, it had pleased God to enlighten our minds so far as to see that some doctrines, which we once esteemed truths, were errors, and that others, which we had esteemed errors, were real truths. From time to time He has been pleased to afford us farther light, and our principles have been improving, and our errors diminishing. Now we are not sure that we are arrived at the end of this progression, and at the perfection of spiritual or theological knowledge; and we fear that, if we should feel ourselves as if bound and confined by it, and perhaps be unwilling to receive further improvement, and our successors still more so, as conceiving what we their elders and founders had done, to be something sacred, never to be departed from.
Richard Hofstadter reminds us, America was founded by intellectuals, a rare occurrence in the history of modern nations. “The Founding Fathers,” he writes, “were sages, scientists, men of broad cultivation, many of them apt in classical learning, who used their wide reading in history, politics, and law to solve the exigent problems of their time.”
On October 16, 1854, in Peoria, Illinois, Douglas delivered a three-hour address to which Lincoln, by agreement, was to respond. When Lincoln’s turn came, he reminded the audience that it was already 5 p.m., that he would probably require as much time as Douglas and that Douglas was still scheduled for a rebuttal. He proposed, therefore, that the audience go home, have dinner, and return refreshed for four more hours of talk. The audience amiably agreed, and matters proceeded as Lincoln had outlined. What kind of audience was this? Who were these people who could so cheerfully accommodate themselves to seven hours of oratory? It should be noted, by the way, that Lincoln and Douglas were not presidential candidates; at the time of their encounter in Peoria they were not even candidates for the United States Senate. But their audiences were not especially concerned with their official status. These were people who regarded such events as essential to their political education, who took them to be an integral part of their social lives, and who were quite accustomed to extended oratorical performances.
Changes in the symbolic environment are like changes in the natural environment; they are both gradual and additive at first, and then, all at once, a critical mass is achieved, as the physicists say. A river that has slowly been polluted suddenly becomes toxic; most of the fish perish; swimming becomes a danger to health. But even then, the river may look the same and one may still take a boat ride on it.
A major new medium changes the structure of discourse; it does so by encouraging certain uses of the intellect, by favoring certain definitions of intelligence and wisdom, and by demanding a certain kind of content—in a phrase, by creating new forms of truth-telling.
The printed book released people from the domination of the immediate and the local; […] print made a greater impression than actual events. […] To exist was to exist in print: the rest of the world tended gradually to become more shadowy. Learning became book-learning.
Although the general character of print-intelligence would be known to anyone who would be reading this book, you may arrive at a reasonably detailed definition of it by simply considering what is demanded of you as you read this book. You are required, first of all, to remain more or less immobile for a fairly long time. If you cannot do this (with this or any other book), our culture may label you as anything from hyperkinetic to undisciplined; in any case, as suffering from some sort of intellectual deficiency. The printing press makes rather stringent demands on our bodies as well as our minds. Controlling your body is, however, only a minimal requirement. You must also have learned to pay no attention to the shapes of the letters on the page. You must see through them, so to speak, so that you can go directly to the meanings of the words they form. If you are preoccupied with the shapes of the letters, you will be an intolerably inefficient reader, likely to be thought stupid. If you have learned how to get to meanings without aesthetic distraction, you are required to assume an attitude of detachment and objectivity. This includes your bringing to the task what Bertrand Russell called an “immunity to eloquence,” meaning that you are able to distinguish between the sensuous pleasure, or charm, or ingratiating tone (if such there be) of the words, and the logic of their argument. But at the same time, you must be able to tell from the tone of the language what is the author’s attitude toward the subject and toward the reader. You must, in other words, know the difference between a joke and an argument. And in judging the quality of an argument, you must be able to do several things at once, including delaying a verdict until the entire argument is finished, holding in mind questions until you have determined where, when or if the text answers them, and bringing to bear on the text all of your relevant experience as a counterargument to what is being proposed.
You must also be able to withhold those parts of your knowledge and experience which, in fact, do not have a bearing on the argument. And in preparing yourself to do all of this, you must have divested yourself of the belief that words are magical and, above all, have learned to negotiate the world of abstractions, for there are very few phrases and sentences in this book that require you to call forth concrete images. In a print-culture, we are apt to say of people who are not intelligent that we must “draw them pictures” so that they may understand.
The weight assigned to any form of truth-telling is a function of the influence of media of communication. “Seeing is believing” has always had a preeminent status as an epistemological axiom, but “saying is believing,” “reading is believing,” “counting is believing,” “deducing is believing,” and “feeling is believing” are others that have risen or fallen in importance as cultures have undergone media change.
Walter Ong points out, in oral cultures proverbs and sayings are not occasional devices: “They are incessant. They form the substance of thought itself. Thought in any extended form is impossible without them, for it consists in them.”
Those who have written vigorously on the matter of TV taking over literacy tell us, […] what is happening is the residue of an exhausted capitalism; or, on the contrary, that it is the tasteless fruit of the maturing of capitalism; or that it is the neurotic aftermath of the Age of Freud; or the retribution of our allowing God to perish; or that it all comes from the old stand-bys, greed and ambition.
In studying the Bible as a young man, I found intimations of the idea that forms of media favor particular kinds of content and therefore are capable of taking command of a culture. I refer specifically to the Decalogue, the Second Commandment of which prohibits the Israelites from making concrete images of anything. “Thou shalt not make unto thee any graven image, any likeness of any thing that is in heaven above, or that is in the earth beneath, or that is in the water beneath the earth.” I wondered then, as so many others have, as to why the God of these people would have included instructions on how they were to symbolize, or not symbolize, their experience. It is a strange injunction to include as part of an ethical system unless its author assumed a connection between forms of human communication and the quality of a culture.
Northrop Frye, who has made use of a principle he calls resonance. “Through resonance,” he writes, “a particular statement in a particular context acquires a universal significance.” Frye offers as an opening example the phrase “the grapes of wrath,” which first appears in Isaiah in the context of a celebration of a prospective massacre of Edomites. But the phrase, Frye continues, “has long ago flown away from this context into many new contexts, contexts that give dignity to the human situation instead of merely reflecting its bigotries.” Having said this, Frye extends the idea of resonance so that it goes beyond phrases and sentences. A character in a play or story—Hamlet, for example, or Lewis Carroll’s Alice—may have resonance.
[…] In addressing the question of the source of resonance, Frye concludes that metaphor is the generative force—that is, the power of a phrase, a book, a character, or a history to unify and invest with meaning a variety of attitudes or experiences. Thus, Athens becomes a metaphor of intellectual excellence, wherever we find it; Hamlet, a metaphor of brooding indecisiveness; Alice’s wanderings, a metaphor of a search for order in a world of semantic nonsense.
[Plato] understood better than anyone else that the setting down of views in written characters would be the beginning of philosophy, not its end. Philosophy cannot exist without criticism, and writing makes it possible and convenient to subject thought to a continuous and concentrated scrutiny.
The written word is far more powerful than simply a reminder: it re-creates the past in the present, and gives us, not the familiar remembered thing, but the glittering intensity of the summoned-up hallucination.
Mumford points out, with the invention of the clock, Eternity ceased to serve as the measure and focus of human events. And thus, though few would have imagined the connection, the inexorable ticking of the clock may have had more to do with the weakening of God’s supremacy than all the treatises produced by the philosophers of the Enlightenment; that is to say, the clock introduced a new form of conversation between man and God, in which God appears to have been the loser.
Can you imagine, for example, a modern economist articulating truths about our standard of living by reciting a poem? Or by telling what happened to him during a late-night walk through East St. Louis? Or by offering a series of proverbs and parables, beginning with the saying about a rich man, a camel, and the eye of a needle? The first would be regarded as irrelevant, the second merely anecdotal, the last childish. Yet these forms of language are certainly capable of expressing truths about economic relationships, as well as any other relationships, and indeed have been employed by various peoples. But to the modern mind, resonating with different media-metaphors, the truth in economics is believed to be best discovered and expressed in numbers.
The introduction into a culture of a technique such as writing or a clock is not merely an extension of humanity’s power to bind time but a transformation of its way of thinking—and, of course, of the content of its culture. And that is what I mean to say by calling a medium a metaphor. We are told in school, quite correctly, that a metaphor suggests what a thing is like by comparing it to something else. And by the power of its suggestion, it so fixes a conception in our minds that we cannot imagine the one thing without the other: Light is a wave; language, a tree; God, a wise and venerable man; the mind, a dark cavern illuminated by knowledge. And if these metaphors no longer serve us, we must, in the nature of the matter, find others that will. Light is a particle; language, a river; God (as Bertrand Russell proclaimed), a differential equation; the mind, a garden that yearns to be cultivated.
To people like ourselves any reliance on proverbs and sayings is reserved largely for resolving disputes among or with children. “Possession is nine-tenths of the law.” “First come, first served.” “Haste makes waste.” These are forms of speech we pull out in small crises with our young but would think ridiculous to produce in a courtroom where “serious” matters are to be decided.
People like ourselves may see nothing wondrous in writing, but our anthropologists know how strange and magical it appears to a purely oral people—a conversation with no one and yet with everyone. What could be stranger than the silence one encounters when addressing a question to a text? What could be more metaphysically puzzling than addressing an unseen audience, as every writer of books must do? And correcting oneself because one knows that an unknown reader will disapprove or misunderstand?
“Pro-Capitalism Gaslighting”
Oliver: Real man hire other man mabel.
A “user interface” is simply one type of dynamic picture. I spent a few years hanging around various UI design groups at Apple, and I met brilliant designers, and these brilliant designers could not make real things. They could only suggest. They would draw mockups in Photoshop, maybe animate them in Keynote, maybe add simple interactivity in Director or Quartz Composer. But the designers could not produce anything that they could ship as-is. Instead, they were dependent on engineers to translate their ideas into lines of text. Even at Apple, a designer aristocracy like no other, there was always a subtle undercurrent of helplessness, and the timidity and hesitation that come from not being self-reliant.
It’s fashionable to rationalize this helplessness with talk of “complementary skillsets” and other such bullshit. But the truth is: An author can write a book. A musician can compose a song. An animator can compose a short. A painter can compose a painting. But most dynamic artists cannot realize their own creations, and this breaks my heart.
That’s a really insightful thing, to realize what matters from bitter experience, right? […] Experience tells you when to worry about something and when not to worry about it. And it’s just nice to see that that’s how you know: I should deal with that before I forget about it.
One of our discoveries is that leveraging the physical world radically reduces complexity. Tasks that might conventionally require “apps” and “codebases” can be done with a few pages of simple, readable programs.
Dynamicland may be the most “open-source” computing environment ever made, because running a program means holding its source literally in your hand. Because programs are small and visible, people pick up the language and become authors while doing other tasks, the way natural languages are learned.
Computing must be reinvented in a form whose inherent complexity is so radically reduced that communities can build their own computing environments, for their own needs, with minimal dependence on vendors, specialists, and centralized production. It must be distributed through ideas and abilities, not products and services.
For there to be any hope of achieving this, we must radically reduce the amount of computing that is necessary in the first place.
Computing is also a medium, the most important medium of the present time and perhaps eventually of all time. A society’s dominant medium structures how people see and understand the world. In the medium of computing, almost all people are illiterate, a disenfranchised underclass which cannot participate in the shaping of their own world.
We believe that true decentralization is grounded in the distribution of ideas and abilities, not products and services. A network of communities who build their own homes, grow their own food, and cook their own meals is decentralized. People living in hotel rooms, with meals delivered to the door, are not.
So as I present each paper, I’m building the structure it describes out of the structure built up for the previous paper.
The unconscious mental processes that produce hysterical symptoms actually go on in the minds of all people at levels of which they are not fully aware. These processes can affect people’s behaviour. Freud realized that, although a patient’s behaviour could be affected by hypnosis, they often did not recall what had happened during the session. This was the beginning of Freud’s development of psychoanalysis.
Sigi often came top of his class - in fact, he did it six years running, which must have annoyed his classmates considerably. He was obviously pushed to succeed by his family and teachers, and the lives of the whole family revolved around his all-important studies. He had his own room in the crowded home, while all the rest of his siblings had to share. He even ate his evening meal apart from the others, and when his sister Anna’s piano playing distracted him from his studies, his parents had the instrument removed from the apartment.
Positivism is a philosophy which limits knowledge to that which is based on actual sense experience. It attempts to affirm theories by strict scientific investigation.
In academic circles Freud was often seen as opinionated and rather peculiar, so that much of his work was done in what he called ‘splendid isolation’, just as it had been from boyhood. He obviously had outstanding intellect, but by his own admission he had a rather neurotic, obsessive personality and could not imagine a life without work. He wrote incessantly and much of his writing was done on his days off, or even after a busy day seeing his patients.
Freud’s obsessive personality meant that he was the kind of person who has to do everything meticulously and accurately and he liked to be in control. This can be seen in various ways outside of his work. He was very superstitious about certain numbers - for instance, he became utterly convinced that he would die at 61 or 62, because of a series of rather tenuous coincidences to do with odd things like hotel room numbers.
The mechanistic view sees a person as a machine, whose life processes and behaviour are determined by physical and chemical causes. Vitalism takes the opposite viewpoint, saying that life processes cannot be explained by the laws of physics and chemistry alone. Heavily influenced by religious dogma, it assumes that non-material forces are at work in biological processes.
Freud rejected vitalist ideas and, following Brücke’s teaching, he became convinced that all biological processes follow a rigid pattern of cause and effect. This way of thinking - the determinist stance - assumes that even the workings of a person’s mind can be explained by strict physical laws.
Freud had learned Greek, Latin, German, Hebrew, French and English, and by the age of eight he was reading Shakespeare. As if all this wasn’t enough, he also taught himself the rudiments of Spanish and Italian. He went into secondary school a year early and his education there emphasized classical literature and philosophy, which greatly influenced his later thinking and writing. His favorite authors were two of the greatest literary figures of Western Europe - the German writer and philosopher Johann Wolfgang von Goethe (1749-1832) and the English poet and playwright William Shakespeare (1564-1616).
One very important development during the nineteenth century was the formulation of the principle of conservation of energy, put forward by the German physician and physicist Hermann von Helmholtz (1821-94). This principle states that the total amount of energy in any physical system is always constant. Energy can be changed in form but never destroyed, so that when energy is moved from one place it must always reappear in another. The Helmholtz principle was applied to various branches of physics, such as thermodynamics and electromagnetism, which began to change the world in all kinds of hugely important ways, for example in the introduction of electrical technology. Biology was quick to take on board the new idea as well, and in 1874 the German Ernst Wilhelm von Brücke (1819-92) wrote a book which explained that all living organisms, including human beings, are essentially energy systems, to which the principle of the conservation of energy applies.
Because Freud admired Brücke so much, he took on board this new ‘dynamic physiology’ and arrived at the idea that the human personality is also an energy system and that we therefore have ‘psychic energy’. The role of the psychologist was therefore to study how this energy works within the psyche. This is really the main basis for Freud’s theories of psychoanalysis and he applied the idea in various ways, such as in his theory about sexual repression.
Freud emphasized the idea that buried emotions often surface in disguised forms during dreaming, and that working with recalled dreams can help to unearth these buried feelings.
Freud suggested that the repressed emotion was rather like a mental boil, unable to discharge its toxic contents, and so giving rise instead to neurotic symptoms. In the case of hysteria, these symptoms became physical and expressed the patient’s trauma in a symbolic, physical form; hence the term ‘conversion hysteria’. Freud cited a case which provides an example of how repression can be expressed in this way, where a boy’s hand froze when his mother asked him to sign a divorce document that denounced his father.
Josef Breuer befriended Freud and even lent him money. Their relationship became so close that Freud named his first child Mathilde after Breuer’s wife. For some time Freud was quite dependent on Breuer, but eventually a rift occurred when Breuer simply could not agree with Freud’s insistence upon sexual motives for everything. By the time their joint publication Studies on Hysteria was published in 1895, their friendship had already ended. Years later, after Breuer’s death, Freud was greatly moved to find out that his friend had continued to follow his career with great interest even after the rift between them had occurred.
The word ‘bourgeois’ is often used in a derogatory way to refer to the capitalist, non-communist way of thinking, which is assumed to be self-seeking, materialistic, dull and unimaginative.
Occasional interruptions to check understanding don’t usually put the speaker off. In fact, they help the speaker to feel properly listened to. They also further slow the conversation. Reducing the speed at which someone pours out the mixture of ideas and emotions and words and memories and possibilities swirling around in their head helps them to express it more clearly. Often simply saying it out loud helps the person to review the whole situation and see a new way to deal with it.
Remember: you don’t need to know what to say. Listen fully, and don’t be distracted by wondering what to say next […] Just listen. Trust yourself; just like feet move in rhythm to unfamiliar music, when it is time to speak words will come to you, and they will come from your heart instead of your head. The task is not to solve, it is simply to listen.
We can check our understanding of the speaker’s emotions using questions like ‘Is telling me this making you feel sad?’ or ‘Do you feel angry while you are remembering this?’ Or we can name the emotion we observe, and check our observation for accuracy: ‘You sound excited about this. Are you?’ or ‘To me, this all sounds quite scary. Does it make you feel anxious?’ Offering our interpretation of the emotions we are observing may give the speaker pause for thought: they may have been so caught up in their emotions that they have not processed them.
Asking the other person to summarise from time to time is another useful way to share responsibility in a conversation, and it is particularly important if they are receiving new information as the discussion progresses.
Think of listening as a waltz, a dance that progresses in triple time: question, question, check; question, question, summary. Without frequent recaps and summaries to check understanding, it is easy to drift into believing that we have understood the story when in fact we have formed inaccurate assumptions about what we have heard.
These worlds that they’re inhabiting right now were mass manufactured by an app developer somewhere. They’re just taking them for granted. They don’t even have the agency to modify or craft the world they live in.
One of the great joys of working in this environment, which I’ll get to more in a bit, is so much of what you’re doing is affecting other things. So it’s not like I have my own screen, I’m making cool stuff in my screen, but you have to come over to my screen to see it.
It’s more like, I’m making something that affects what’s in front of you. And you’re making things that affect what’s in front of me, or we affect the entire space.
It’s a computer designed specifically around a social context, where people are actually really together.
And it’s designed for people to make their own things, not to take premade stuff off the shelf that was handed to you by benevolent corporate overlords.
And where you can use all of the thousands of capabilities and facilities of being a human being with a human body, as opposed to having all of your interaction reduced to poking at a piece of glass with a finger.
There’s a kind of unspoken assumption in user interface design, maybe unacknowledged. Almost all software is designed with the assumption that the user is alone. That it’s just the software and the user, and the two of them have to solve the world.
[…] If you go back to the original microcomputers like the Apple II, late 70s, early 80s, we’re living in a lot of the design patterns that were established then.
Those computers were made by people who basically wanted to be alone in the basement, just them and the computer, mind-melding with the machine. And we live in the world that they created.
Being able to write on floppies also opened the door to somebody else making a floppy disk, and then selling it to you. You could put this thing in your computer, and suddenly you’re running a program that you didn’t make, but that you bought.
And that’s, again, kind of a peculiar thing. Buying a paint set doesn’t really open the door to enjoying somebody else’s art. They’re very different things. Buying a piano doesn’t help you play mp3s. These are, in most circumstances, very different things.
But in the computer, there’s this kind of pun. And that led directly to a consumer software industry. That was kind of a new thing. There had been commercial software for many decades, for banks and institutions and the military, but this was the first time that software was being sold to the mass market.
So it led to a consumer software industry, it led to a class of professional programmers, which led to the assumption that programming was a profession. That learning to program was vocational training, and what you did with your programming skill was, you made things to sell.
If you look at modern programming tools like Xcode and Visual Studio, they’re much more polished than what we had on the Apple II, but they’re all built with the assumption that you’re making things to sell to other people, rather than making things for your own needs.
‘See you tonight? Seven-thirty outside the cinema?’ ‘Yes, fine, see you later!’ But if there’s more than one cinema, or I’m not certain how long it will take me to get there, I may need a pause to give me thinking time before I say, ‘I’m not sure I can be there by seven-thirty. What time does the film start?’ Now you need some thinking time, too: to remember the start time, or to calculate whether the advertisements before the film will allow us some leeway, or to remind yourself if there’s a later showing of the same film. The presence of silences during the conversation has the effect of slowing everything down. Slowing down allows us to focus better on what is being said, and for many people this slowing also reduces any anxiety they felt coming into a conversation that might be important, emotional or long-awaited. […] Remain aware, though, that an ‘expectant silence’ can seem threatening to someone who does not feel ready to explore their uncomfortable thoughts:
I and just about every designer of Common Lisp and CLOS has had extreme exposure to the MIT/Stanford style of design. The essence of this style can be captured by the phrase the right thing. To such a designer it is important to get all of the following characteristics right:
Simplicity — the design must be simple, both in implementation and interface. It is more important for the interface to be simple than the implementation.
Correctness — the design must be correct in all observable aspects. Incorrectness is simply not allowed.
Consistency — the design must not be inconsistent. A design is allowed to be slightly less simple and less complete to avoid inconsistency. Consistency is as important as correctness.
Completeness — the design must cover as many important situations as is practical. All reasonably expected cases must be covered. Simplicity is not allowed to overly reduce completeness.
I believe most people would agree that these are good characteristics. I will call the use of this philosophy of design the MIT approach. Common Lisp (with CLOS) and Scheme represent the MIT approach to design and implementation.
The worse-is-better philosophy is only slightly different:
Simplicity — the design must be simple, both in implementation and interface. It is more important for the implementation to be simple than the interface. Simplicity is the most important consideration in a design.
Correctness — the design must be correct in all observable aspects. It is slightly better to be simple than correct.
Consistency — the design must not be overly inconsistent. Consistency can be sacrificed for simplicity in some cases, but it is better to drop those parts of the design that deal with less common circumstances than to introduce either implementational complexity or inconsistency.
Completeness — the design must cover as many important situations as is practical. All reasonably expected cases should be covered. Completeness can be sacrificed in favor of any other quality. In fact, completeness must be sacrificed whenever implementation simplicity is jeopardized. Consistency can be sacrificed to achieve completeness if simplicity is retained; especially worthless is consistency of interface.
Early Unix and C are examples of the use of this school of design, and I will call the use of this design strategy the New Jersey approach. I have intentionally caricatured the worse-is-better philosophy to convince you that it is obviously a bad philosophy and that the New Jersey approach is a bad approach.
A further benefit of the worse-is-better philosophy is that the programmer is conditioned to sacrifice some safety, convenience, and hassle to get good performance and modest resource use. Programs written using the New Jersey approach will work well both in small machines and large ones, and the code will be portable because it is written on top of a virus.
It is important to remember that the initial virus has to be basically good. If so, the viral spread is assured as long as it is portable. Once the virus has spread, there will be pressure to improve it, possibly by increasing its functionality closer to 90%, but users have already been conditioned to accept worse than the right thing. Therefore, the worse-is-better software first will gain acceptance, second will condition its users to expect less, and third will be improved to a point that is almost the right thing. In concrete terms, even though Lisp compilers in 1987 were about as good as C compilers, there are many more compiler experts who want to make C compilers better than want to make Lisp compilers better.
In Realtalk you make things. You don’t download apps. You make things out of stuff that other people have left lying around. And there’s intended to be this very gentle slope from, you start out playing with things that are lying around. But then you can easily just make little tweaks. So you change a color from green to red. Or you put your name in there. Or you just grab two things other people made and put them together.
Then it’s this gradual progression from tweaking and remixing, to taking on more substantial projects. And that extends all the way down to the system itself, which is just as visible and tangible as everything else. […] The community just comes up and puts things on this whiteboard.
There’s no distinction between OS and user. It’s just basically the stuff up here has a wider scope than anything else. But anybody can go up and grab the compiler and change the compiler. And everybody is now using your new compiler.
Here’s an illustration. What you see here on the left is the first instance of the pop-up menu. This is in Smalltalk in the early 70s. What you see on the right is the first commercialization of the menu.
These two objects have completely different purposes. These two objects have nothing to do with each other. But they look alike. And because we’ve grown up with descendants of that thing on the right, we can’t actually see the thing on the left for what it actually is.
To understand the thing on the left, you have to understand that Alan Kay was trying to create a medium in which all people would be literate in modeling and simulating the complex systems of the world. And he thought a good way of organizing those models might be as these computational objects that are sending messages to each other. So that’s where “object-oriented” came from.
And so, how do you send a message to an object? Well, one way is to write it out. You can give the name of the object and give the message. And that was the Smalltalk programming language. And that’s a powerful context because you can send many messages, you can abstract over them, you can define your own messages.
But Alan Kay was trying to create a new form of literacy, which meant that it had to extend down to the world of young children. Young children had to grow up immersed in this computational medium in the same way that children today grow up immersed in a world of written text. And young children aren’t particularly fluent in abstract language. But they are good at using their hands. Well, is there a way to send a message to an object using your hands? Well, maybe. You use this mouse thing to click on the object, and the object then reveals its vocabulary to you. “Here are the messages that I know how to deal with.” And these are the exact same messages you can then use in the programming language.
They’re the same vocabulary, whether you choose from the menu or whether you write it out. And so when a child clicks on that menu on the left, they’re doing two things. One is that they’re actually invoking that action. But the second thing is that they’re learning this vocabulary. They’re learning how to communicate with this object, and they can then use that vocabulary later, when they’re ready, in the programming context, to make things of their own, to go beyond what’s on the menu.
So that thing on the left is educational scaffolding. The thing on the left is designed to teach children to program. The thing on the right is access to functionality. You click on the thing on the right, it runs an assembly language subroutine written by a priesthood of programmers who have decided what you can and cannot do. And if you want to do something that’s not on the menu, you buy another computer.
So that thing on the left is designed to expand your world, expand your possibilities. The thing on the right constricts your possibilities. The thing on the left is designed to teach you to become self-sufficient, to go beyond the menu. The thing on the right teaches you to be dependent on a corporation for all your needs. And that ideology on the right has become so ubiquitous and so widespread that we do live in a world where, if you want to do anything on a computer, you use an app to do it. And the idea on the left, this amazing idea that, the purpose of a user interface is to teach you to go beyond the interface, that idea has been buried.
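The menu-as-vocabulary idea above can be sketched in a few lines. This is an illustration, not actual Smalltalk: the class, method names, and helpers are all made up. The point is that the messages an object reveals in its menu are the exact same messages you can write out in code.

```python
# Illustrative sketch (not real Smalltalk): an object's menu and its
# programming vocabulary are one and the same.

class Box:
    def move(self):  return "moving"
    def grow(self):  return "growing"
    def erase(self): return "erased"

def menu(obj):
    """Clicking on an object reveals its vocabulary: the messages it knows."""
    return sorted(m for m in dir(obj) if not m.startswith("_"))

def send(obj, message):
    """Writing the message out invokes exactly the same vocabulary."""
    return getattr(obj, message)()

box = Box()
print(menu(box))          # → ['erase', 'grow', 'move']
print(send(box, "grow"))  # → growing
```

Choosing “grow” from the menu and writing `send(box, "grow")` do the same thing — which is why the menu works as scaffolding: it teaches the vocabulary you later use to program.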
Alan Kay, who was one of the principals at Xerox PARC and was responsible for Smalltalk, he wanted to create a new form of literacy in which all people could model and simulate the complex systems of the world. “Understand systems are too complicated to think about in classical ways.” The programming languages and operating systems, again, were kind of a byproduct of what was ultimately a completely educational goal.
Describing a memory of people making games in Dynamicland: There’s a bunch of people sitting around this table, and they’re all making their little objects. But then they found it was more fun to make objects that messed with other people’s objects. And then people reprogrammed those other people’s objects, and it became this kind of communally-authored stew of software, where nobody even remembered who made what, because everybody made everything.
Engelbart who invented working with text on the screen, and mice, and hypertext, and video conferencing, and all that—the agenda of his research lab was not actually to invent any of that stuff. The purpose of the research lab was to find ways for his own research lab to be more effective at inventing stuff.
Bootstrapping is a process where you’re not trying to make any particular product or technology to put into the world. You’re actually trying to create your own little world. You’re trying to create an environment around yourself where you’re using tools and processes that are qualitatively different from those outside. Chuck Thacker at Xerox PARC who invented the Xerox Alto put it as: “We were spending a lot of money to simulate the future.”
So you can think of it as, the laboratory itself is the experiment. The goal is to create a social environment in which people are living in the future. And that serves as a prototype for how all people may someday be able to do that.
The equivalent of Stripe’s Patrick and John Collison 20 years from now isn’t going to have a whole lot to work with.
They’re probably not going to see it that way. They’re going to see it as, progress has stalled because all the low-hanging fruit has been taken.
But that’s not how it works. There’s an infinite amount of fruit in computing. You just need the right kind of research environment to cultivate it. And the right kind of research environment that has cultivated an enormous amount of fruit is what Engelbart called a bootstrapping research environment.
The power of that invitation was to give Irene a sense of control. That simple change enabled her to choose when to talk, and she also decided what to do to cheer herself up at the end of the conversation.
A safe place will usually offer some privacy. It might mean freedom from interruption, or somewhere that is familiar and reassuring. It may mean waiting for the person’s supporters to join them, either to listen or to be participants in the conversation, giving the person a sense of security; or it may mean beginning sooner, so they can seize a chance to talk without the pressure of extra people in the room.
Charles replying to Mabel: It makes perfect sense, we just can’t make sense of it!
Miller columns (also known as cascading lists) are a browsing/visualization technique that can be applied to tree structures. The columns allow multiple levels of the hierarchy to be open at once, and provide a visual representation of the current location. It is closely related to techniques used earlier in the Smalltalk browser, but was independently invented by Mark S. Miller in 1980 at Yale University. The technique was then used at Project Xanadu, Datapoint, and NeXT.
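The mechanics are simple enough to sketch. Here is a minimal, hypothetical example (the file names and tree structure are invented): given a tree and the currently selected path, compute the cascading columns to display, one per open level.

```python
# Minimal sketch of Miller-column navigation: each step of the selected
# path opens the next level as a new column alongside the previous ones.

def miller_columns(tree, path):
    """Return a list of columns; each column lists the entries at that level."""
    columns = [sorted(tree)]              # root column is always visible
    node = tree
    for name in path:                     # walk the selected path
        node = node[name]
        if isinstance(node, dict) and node:
            columns.append(sorted(node))  # open the next level
    return columns

fs = {"Documents": {"notes.txt": None, "drafts": {"a.md": None}},
      "Music": {"song.mp3": None}}

# Selecting Documents > drafts shows three columns at once.
print(miller_columns(fs, ["Documents", "drafts"]))
# → [['Documents', 'Music'], ['drafts', 'notes.txt'], ['a.md']]
```

Because every level along the path stays open, the user sees both where they are and how they got there — the “visual representation of the current location” the note describes.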
To get started, we need to bear in mind whether the circumstances are right. ‘Right’ has to mean ‘good enough’, because we’ll never find the perfect moment. If the time is right, and the setting allows us a chance to talk, that is probably about as good as it gets. The style guide has two pieces of wisdom here: the first is that the other person has a right to choose, too.
It’s stories, not rules, that change people.
It doesn’t matter whether it’s asking someone out on a date or talking about our funeral arrangements with our dear ones: sometimes our own emotions and sometimes our concern about theirs hold us back. Finding a way to begin that allows both people in the conversation to feel confident of being respected and heard sets a tone of collaboration for the rest of the discussion.
Now Dorothy is giving a masterclass in dealing with unwelcome news. She is sitting down. Why didn’t I sit down? She has taken Mrs de Souza’s hand in hers and she is stroking Mrs de Souza’s shoulder with her other hand. I know Dorothy has three patients in the observation unit who are all very sick, and that she can’t spend much time here, and yet here she is bending time, extending it by sounding unhurried, making every second count as she focuses her attention on Mrs de Souza.
‘This is very shocking, my love,’ she purrs to Mrs de Souza. ‘Very shocking. Did you know he had a bad heart?’ Mrs de Souza lifts her head and takes a sobbing breath. Dorothy hands her a tissue from the box on the table. Mrs de Souza blows her nose, then says, ‘He’s had a bad heart for years. He was in here with his first heart attack a couple of years ago and we nearly lost him. He’s had more pains recently, that angina pain, and the doctor changed his tablets …’ She trails off.
‘Were you worrying about him?’ asks Dorothy, a question I can see reaching to the weeping woman’s soul.
‘He wouldn’t rest,’ Mrs de Souza sighs. ‘He worked too hard. I told him he was lucky to survive last time.’
‘So you thought he might die last time?’ asks Dorothy, gently, and Mrs de Souza stares into the middle distance, mopping her eyes and nodding.
‘I think we’ve been on borrowed time,’ she whispers. Dorothy waits. ‘He wasn’t well this morning: stressed by something at work, he looked grey and I told him to stay off, but …’ and she shakes her head, weeping more quietly now, sorrow instead of shock, sadness in place of anger.
It is fascinating to watch the way Dorothy has used questions to help Mrs de Souza step from her knowledge of her husband’s heart disease, past his first heart attack, into her recent worries about his health and to her very specific concern this morning. She has built a bridge for Mrs de Souza to walk across, and in answering Dorothy’s questions Mrs de Souza has prepared herself for this unwanted and yet not entirely unexpected moment. She has told Dorothy the Story So Far.
‘I am so sorry, my love,’ says Dorothy. ‘He wasn’t conscious when the ambulance arrived, and his heart was beating very slowly at first and then it stopped. The team did all they could …’ She pauses again, and in that pause I see the path I could have taken: a conversation about the past, the wife’s concerns, her worry today. I was so busy making sure I told her the dreadful news that I didn’t bring her to a place where she could receive it. Dorothy has wound back the story and then brought her, step by step, to this place: now we can move forward a little further.
‘Would you like to come with me to see him?’ asks Dorothy. ‘He’s lying on a bed around the corner, and you can sit with him there if you would like to.
‘Would you like us to contact somebody for you? Your family? A priest? Anyone who can support you here?’
Mrs de Souza says she would like a Catholic priest to be called, and Dorothy takes her by the hand to lead her from the room. As they pass me, Dorothy says, ‘Make us a cup of tea, we’ll be in cubicle three. Bring one for yourself, too.’
Then Dorothy takes Mrs de Souza to sit with her dead husband. When I deliver the tea, Mrs de Souza thanks me. Dorothy has reconstructed the whole transaction, skilfully yet simply, by using gentle questions about what Mrs de Souza knew, to help her to recognize that she was already expecting bad news.
Answering users of the Unix man command who noticed that it prints "gimme gimme gimme" at 00:30 in the morning: The maintainer of man is a good friend of mine, and one day six years ago I jokingly said to him that if you invoke man after midnight it should print "gimme gimme gimme", because of the Abba song called "Gimme gimme gimme a man after midnight"
‘I can’t find the words.’ Right now, there is quite likely to be a conversation you are trying to avoid. It is probably one that is important to you, but it has a quality of discomfort to it. Perhaps the conversation requires sharing of a difficult truth; enquiring about information that may be life-changing; proposing something that risks rejection; discussing a topic that will unleash strong emotions; consoling someone experiencing sorrow. There is a push-pull of commitment: the need to act and yet the fear of vulnerability. Not just yet. Soon, but not just yet: I will call, or visit, or make that appointment. We stand on the brink, unsure how to begin.
Societies and cultures are not like rocks, unchanging and unchangeable. They move. Western societies and cultures move, and non-Western societies and cultures move—often much faster.
Historically, humans lived in surroundings that didn’t change much. Learning how things worked and then assuming they would continue to work that way rather than constantly reevaluating was probably an excellent survival strategy.
Just 50 years ago, China, India, and South Korea were all way behind where sub-Saharan Africa is today in most ways, and Asia’s destiny was supposed then to be exactly what Africa’s destiny is supposed to be now: “They will never be able to feed 4 billion people.”
For years after the global crash of 2008, the International Monetary Fund continued to forecast 3 percent annual economic growth for countries on Level 4. Each year, for five years, countries on Level 4 failed to meet this forecast. Each year, for five years, the IMF said, “Next year it will get back on track.” Finally, the IMF realized that there was no “normal” to go back to, and it downgraded its future growth expectations to 2 percent. At the same time the IMF acknowledged that the fast growth (above 5 percent) during those years had instead happened in countries on Level 2, like Ghana, Nigeria, Ethiopia, and Kenya in Africa, and Bangladesh in Asia.
You see this everywhere on Levels 2 and 3 across the world. In Sweden, if someone built their house like that, we would think they had a severe planning problem, or maybe the builders had run away. But you can’t generalize from Sweden to Tunisia. The Salhis, and many others living in similar circumstances, have found a brilliant way to solve several problems at once. On Levels 2 and 3, families often do not have access to a bank where they can put their savings, and cannot get a loan. So, to save up to improve their home, they must pile up money. Money, though, can be stolen or lose its value through inflation. So, instead, whenever they can afford them, the Salhis buy actual bricks, which won’t lose their value. But there is no space inside to store the bricks and the bricks might get stolen if they are left in a pile outside. Better to add the bricks to the house as you buy them. Thieves can’t steal them. Inflation won’t change their value. No one needs to check your credit rating. And over 10 or 15 years you are slowly building your family a better home. Instead of assuming that the Salhis are lazy or disorganized, assume they are smart and ask yourself, How can this be such a smart solution?
When a European Union representative at the World Economic Forum in Davos in 2007 stated, as a self-evident fact, that China emits more CO2 than the USA, and India more than Germany, the Chinese expert became angry but kept silent: The Indian expert, in contrast, could not sit still. He waved his arm and could barely wait for the moderator’s signal that he could speak. He stood up. There was a short silence while he looked into the face of each panel member. His elegant dark blue turban and expensive-looking dark gray suit, and the way he was behaving in his moment of outrage, confirmed his status as one of India’s highest-ranking civil servants with many years’ experience as a lead expert at the World Bank and the International Monetary Fund. He made a sweeping gesture toward the panel members from the rich nations and then said loudly and accusingly, “It was you, the richest nations, that put us all in this delicate situation. You have been burning increasing amounts of coal and oil for more than a century. You and only you pushed us to the brink of climate change.” Then he suddenly changed posture, put his palms together in an Indian greeting, bowed, and almost whispered in a very kind voice, “But we forgive you, because you did not know what you were doing. We should never blame someone retrospectively for harm they were unaware of.” Then he straightened up and delivered his final remark as a judge giving his verdict, emphasizing each word by slowly moving his raised index finger. “But from now on we count carbon dioxide emission per person.”
The Indian doctor says: So your country has become so safe that when you go abroad, the world is dangerous for you.
When one of my students asks why the walls of the hospital in India are not painted: She explains that not painting the walls can be a strategic decision in countries on Levels 2 and 3. It’s not that they can’t afford the paint. Flaking walls keep away the richer patients and their time-consuming demands for costly treatments, allowing hospitals to use their limited resources to treat more people in more cost-effective ways.
The gap instinct divides the world into “us” and “them,” and the generalization instinct makes “us” think of “them” as all the same.
Symmetrical elements, even if they are not physically connected to each other, are perceived as though they are. So when we see these two facing square brackets [ ], our mind unconsciously integrates them into one coherent object.
The design principle of proximity […] is about the distance between a control and the object that it affects. The closer a control is to that object, the more we assume there to be a connection between the two.
Progressive disclosure is a technique for managing complexity. The term is used almost exclusively in the context of interaction design. […] Progressive disclosure gradually eases people from the simple to the more complex.
Progressive disclosure is your best friend. By keeping things simple for novices, they are less likely to feel intimidated, overwhelmed, or get themselves into trouble, while more experienced users can quickly reveal the options and actions that they require, that they understand, and that they are capable of using.
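The pattern can be sketched in a few lines. The control names below are invented for illustration (think of a print dialog); the point is simply that the default view stays small, and complexity appears only on explicit request.

```python
# A minimal sketch of progressive disclosure: novices see only the basic
# controls; a "show advanced" action reveals the rest.

BASIC = ["Print", "Copies"]
ADVANCED = ["Duplex", "Color profile", "Paper tray"]

def visible_controls(show_advanced=False):
    # Start simple; reveal complexity only when asked for.
    return BASIC + (ADVANCED if show_advanced else [])

print(visible_controls())      # → ['Print', 'Copies']
print(visible_controls(True))  # basic plus the advanced options
```

The design choice is in the default: the simple view is what everyone lands on, and nothing in it hints that the advanced options even exist until the user goes looking.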
Mapping is about designing controls to resemble the objects that they affect. Shades go up, and shades go down. So it makes sense to use a control that mirrors that up and down movement. There is no ambiguity about how to raise or lower the shade. Mapping also relates to how controls are arranged relative to each other. Their order should resemble the configuration of the objects that they affect.
[…] You will often find labels when mapping is unclear. It is a telltale sign. But it is not a particularly good solution. Reading takes time, and it doesn’t help people to memorize the location of controls or how they should interact with them.
Internal consistency is about designing controls to share a similar look and feel, that match each other. Your app’s glyphs should have a consistent visual style. Text in your app should have a limited number of font faces and sizes and colors and so forth. Internal consistency helps to make an app feel cohesive or whole. When everything matches, when everything fits, people are given a deeper sense of a product’s integrity.
We intuitively believe that design choices were deliberate and thoroughly considered. And with very good reason. Being consistent takes self-control and restraint.
If you see a set of switches on the wall, and you know that one of them controls lights, then it’s reasonable for you to assume that every switch is a light switch. If one of them controls the shade, then it would be better to separate it out from the rest.
Regardless of the exact technique that you use, your app’s interface must make clear what action possibilities it affords. If not, people won’t know how to properly interact with it. They’ll interact in ways that your app doesn’t support, and they’ll confuse controls for non-interactive objects.
People are more likely to perceive an affordance when it is related to an action that they are more likely to take.
We are all becoming more comfortable with increasing levels of abstraction. The button, after all, is just a highly abstracted version of a physical, real-world button. All that’s necessary to create a strong connection between the two is the rounding of the corners.
A subtle drop shadow around the slider knob separates it from the track that it is positioned over, and this separation suggests that it can be moved independently. And even that visual cue might not be completely necessary. For most people, simply seeing a filled-in circle over a line is all that is needed to express a sliding affordance.
And sometimes affordance is communicated using animation. Tapping on the Weather app, we see it slide up a little bit. This suggests the possibility that the content areas can be scrolled. And sure enough, they can.
The 80/20 rule is a design principle that says, in essence, that 80 percent of a system’s effects come from 20 percent of its causes. For an app, this might mean that 80 percent of its benefit comes from 20 percent of the actions that it presents, or that 80 percent of the people who use an app are only going to use 20 percent of its functions. The exact percentages will, of course, vary. But the basic point is valid.
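A quick way to see the rule in practice is to rank actions by usage and find the smallest set that covers most of it. The usage counts below are made up for illustration:

```python
# Hypothetical usage counts illustrating the 80/20 rule: find the smallest
# set of actions that accounts for at least 80% of all use.

def core_actions(usage, threshold=0.80):
    total = sum(usage.values())
    ranked = sorted(usage.items(), key=lambda kv: kv[1], reverse=True)
    core, covered = [], 0
    for action, count in ranked:
        core.append(action)
        covered += count
        if covered / total >= threshold:
            break
    return core

usage = {"open": 50, "save": 30, "export": 10, "print": 6, "settings": 4}
print(core_actions(usage))  # → ['open', 'save'] — 2 of 5 actions cover 80%
```

Here 40 percent of the actions carry 80 percent of the use — not literally 20/80, which is the point: the ratio varies, but the skew is what matters for deciding which controls deserve the most prominent placement.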
That traditional idea is that things start with a pidgin, which Wikipedia describes as “a grammatically simplified means of communication that develops between two or more groups of people that do not have a language in common: typically, its vocabulary and grammar are limited and often drawn from several languages”
A creole language, or simply creole, is a stable natural language that develops from the process of different languages simplifying and mixing into a new form (often a pidgin), and then that form expanding and elaborating into a full-fledged language with native speakers, all within a fairly brief period.
When the journalist says with a sad face, “in times like these,” will you smile and think that she is referring to the first time in history when disaster victims get immediate global attention and foreigners send their best helicopters? Will you feel fact-based hope that humanity will be able to prevent even more horrific deaths in the future?
The image of a dangerous world has never been broadcast more effectively than it is now, while the world has never been less violent and more safe.
In the deepest poverty you should never do anything perfectly. If you do you are stealing resources from where they can be better used.
In 2016 a total of 40 million commercial passenger flights landed safely at their destinations. Only ten ended in fatal accidents. Of course, those were the ones the journalists wrote about: 0.000025 percent of the total. Safe flights are not newsworthy.
[…] Over the last 70 years, flying has gotten 2,100 times safer.
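The percentage quoted above checks out (a quick calculation using only the figures in the text):

```python
# Checking the quoted figures: 10 fatal accidents out of 40 million flights.
fatal, total = 10, 40_000_000
pct = fatal / total * 100
print(pct)  # 2.5e-05, i.e. 0.000025 percent of all flights
```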
Talking about the 1,600 people who died after the tsunami damaged the Fukushima nuclear plant: These 1,600 people died because they escaped. They were mainly old people who died because of the mental and physical stresses of the evacuation itself or of life in evacuation shelters. It wasn’t the radioactivity, but the fear of radioactivity […]
Natural disasters (0.1 percent of all deaths), plane crashes (0.001 percent), murders (0.7 percent), nuclear leaks (0 percent), and terrorism (0.05 percent). None of them kills more than 1 percent of the people who die each year, and still they get enormous media attention.
Chemophobia also means that every six months there is a “new scientific finding” about a synthetic chemical found in regular food in very low quantities that, if you ate a cargo ship or two of it every day for three years, could kill you. At this point, highly educated people put on their worried faces and discuss it over a glass of red wine. The zero-death toll seems to be of no interest in these discussions. The level of fear seems entirely driven by the “chemical” nature of the invisible substance.
The key in making great and growable systems is much more to design how its modules communicate rather than what their internal properties and behaviors should be. Think of the internet – to live, it (a) has to allow many different kinds of ideas and realizations that are beyond any single standard and (b) to allow varying degrees of safe interoperability between these ideas.
The earliest traces of mathematical knowledge in the Indian subcontinent appear with the Indus Valley Civilization (c. 4th millennium BCE ~ c. 3rd millennium BCE). The people of this civilization made bricks whose dimensions were in the proportion 4:2:1, which is favorable for the stability of a brick structure. They also tried to standardize measurement of length to a high degree of accuracy. They designed a ruler—the Mohenjo-daro ruler—whose unit of length (approximately 1.32 inches or 3.4 centimeters) was divided into ten equal parts. Bricks manufactured in ancient Mohenjo-daro often had dimensions that were integral multiples of this unit of length.
The oral tradition of preliterate societies had several features, the first of which was its fluidity. New information was constantly absorbed and adjusted to new circumstances or community needs. There were no archives or reports. This fluidity was closely related to the practical need to explain and justify a present state of affairs.
The Mesopotamian cuneiform tablet Plimpton 322, dating to the eighteenth century BCE, records a number of Pythagorean triples, such as (3, 4, 5) and (5, 12, 13), hinting that the ancient Mesopotamians might have been aware of the Pythagorean theorem over a millennium before Pythagoras.
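The triples above are easy to verify, and Euclid's classical formula generates arbitrarily many of them (a small sketch; the formula is standard number theory, not something read off the tablet itself):

```python
# Euclid's formula (m^2 - n^2, 2mn, m^2 + n^2) yields a Pythagorean
# triple for any integers m > n > 0.
def euclid_triple(m, n):
    return (m * m - n * n, 2 * m * n, m * m + n * n)

# The two triples named in the text, plus one generated by the formula.
for a, b, c in [(3, 4, 5), (5, 12, 13), euclid_triple(4, 1)]:
    assert a * a + b * b == c * c
print("all satisfy a^2 + b^2 = c^2")
```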
Ancient Egypt’s geometry developed out of surveying: the need to preserve the layout and ownership of farmland that was flooded annually by the Nile River.
I have this mantra: “Thought is the Enemy of Flow”
Nyctography is a form of substitution cipher writing created by Lewis Carroll (Charles Lutwidge Dodgson) in 1891. It is written with a nyctograph (a device invented by Carroll) and uses a system of dots and strokes all based on a dot placed in the upper left corner. Using the Nyctograph, one could quickly jot down ideas or notes without the aid of light. Carroll invented the Nyctograph and Nyctography as he was often awakened during the night with thoughts that needed to be written down at once, and didn’t want to go through the lengthy process of lighting a lamp only to have to then extinguish it.
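Carroll's actual dot-and-stroke alphabet is not reproduced here, but a substitution cipher of the same shape can be sketched with an invented mapping (hypothetical glyphs, not Carroll's):

```python
# Invented dot/dash glyphs purely for illustration; Carroll's real
# nyctographic alphabet used square cells of dots and strokes.
CODE = {"a": ".-", "b": "..", "c": "-."}
DECODE = {glyph: letter for letter, glyph in CODE.items()}

def encode(text):
    """Write each letter as its glyph, space-separated."""
    return " ".join(CODE[ch] for ch in text)

def decode(glyphs):
    """Recover the plaintext from a space-separated glyph string."""
    return "".join(DECODE[g] for g in glyphs.split(" "))

print(encode("cab"))            # -. .- ..
print(decode(encode("cab")))    # cab
```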
Showing a development screenshot of the Macintosh, where the menu bar shows an Apple logo next to the shortcut keys instead of the modern command symbol: One day Steve Jobs came in and was like, “That’s our logo! That’s our crown jewel! You can’t just throw that around! You now have an Apple farm in every menu!” So then we had to go back and think, like, “what could it be, the command key or the feature key?” Yes, I did try the Ten Commandments! It was stupid!
[…] But um, I used to look through books sometimes when I was at a loss for something, and I looked in this symbol dictionary and there was that symbol, and it said “feature”, and we used to call it the feature key. So I kind of thought, well, you could definitely draw it in 16 by 16, and it kind of looked like a cloverleaf highway to me [?!], so maybe it was, like, good? you know?
You really want every symbol you make to mean something, and somebody emailed me […] and must have heard me say that I wish that symbol meant more, and he sent me a picture of […] the derivation of that symbol in Scandinavia was that it’s a castle with turrets. So that’s why it’s a good symbol for um, you know “landmarks”, and I just loved that it actually meant something after all.
Vanessa describing why she wanted to throw herself under the train every day she went to work: Maybe the point is, I feel like I was something, and then I was, shaped into, suffocated into being like something else, something less, something smaller. And that’s why I kind of developed the sense that I’m not good enough, and […] there is something wrong with me. Like there is something fundamentally, deeply wrong with me as a human being. And perhaps that is something that I’ve been trying to kind of then fix through these different achievements in life, like go to university, get those results, because I felt that that is my responsibility, as Vanessa, as the person I am.
Many of the rituals and structures of primary education are designed to prepare people for factory labor. That’s why they have bells ringing and you have to get up and you have to move from room to room. And there’s no particular reason you should have to move from room to room.
The interesting question for me is, why are they still doing that? Because it’s not like very many kids going to school are going to be working in factories anymore. My conclusion is that they are preparing us for a life that isn’t going to make a lot of sense.
I have assisted in three reorganizations where people lost their jobs, where people were crying, people who had been working there for 20 years or more.
So, in the back of my head, I always had the thought: you are giving the best of yourself to this company, but remember one day, they will call you and say, you have to pack your stuff in a week.
Corporations are a grand example of the ‘emperor has no clothes’. It’s all about promise. It’s all about what we’re going to do in the future.
They’re constantly undergoing these change processes, reorganization, restructuring, downsizing, rightsizing. You fire people, you hire people, you shift around the signs, but very little, at the end of the day, actually changes.
When everyone builds in public, no one builds in public.
Let’s admit it, the main purpose of building in public is to attract attention and build a community, so you can keep selling products. But when everyone is doing so (and some are doing it exceptionally well), how much attention can you get?
I realized that my texting aversion wasn’t a problem to be solved; it was a worry to be examined.
The problem with good habits, in other words, is that they sacrifice intentionality for efficiency.
Building good habits is kind of like boiling a frog alive. Except you’re the frog. At first, the water feels nice and comfy. It’s your natural habitat. But then, the circumstances keep changing without you noticing, and suddenly you’re stuck in a situation you may not want to be in. In fact, I found that the more I’m immersed in a habit and the better it sounds on paper, the less likely I notice the rising heat.
“Life is daily; today is all we have” — when I truly think about the implications of that insight, I get scared. I suppose that’s precisely the reason why an entire industry has been built around habit-building. Building good habits gives us the impression that life is not, in fact, messy, that we don’t need to die soon, and that we are perfectly in control and on top of things. Habits promise control, stability, consistency.
Alas, this is an illusion.
The more I’ve thought about it, the more I realized that habits are nothing but death deniers, faint quests for immortalization. Ultimately, life is daily, and how we spend our days is how we spend our lives.
As shared by Masoud The only way to deal with an unfree world is to become so absolutely free that your very existence is an act of rebellion.
With averages we must always remember that there’s a spread.
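A minimal illustration with made-up numbers: two groups can share the same average while their spreads differ wildly:

```python
# Two invented data sets with identical means but very different spreads.
from statistics import mean, stdev

narrow = [49, 50, 50, 51]
wide = [10, 30, 70, 90]

print(mean(narrow), mean(wide))    # both 50
print(stdev(narrow), stdev(wide))  # the second is far larger
```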
Warning: Objects in Your Memories Were Worse Than They Appear
Melinda Gates runs a philanthropic foundation together with her husband, Bill. They have spent billions of dollars to save the lives of millions of children in extreme poverty by investing in primary health care and education. Yet intelligent and well-meaning people keep contacting their foundation saying that they should stop. The argument goes like this: “If you keep saving poor children, you’ll kill the planet by causing overpopulation.”
The goal of higher income is not just bigger piles of money. The goal of longer lives is not just extra time. The ultimate goal is to have the freedom to do what we want.
Factfulness is … recognizing when a story talks about a gap, and remembering that this paints a picture of two separate groups, with a gap in between. The reality is often not polarized at all. Usually the majority is right there in the middle, where the gap is supposed to be. To control the gap instinct, look for the majority.
We avoid reminding ourselves and our children about the miseries and brutalities of the past. The truth is to be found in ancient graveyards and burial sites, where archeologists have to get used to discovering that a large proportion of all the remains they dig up are those of children. Most will have been killed by starvation or disgusting diseases, but many child skeletons bear the marks of physical violence. Hunter-gatherer societies often had murder rates above 10 percent and children were not spared. In today’s graveyards, child graves are rare.
In the year 1800, roughly 85 percent of humanity lived on Level 1, in extreme poverty. All over the world, people simply did not have enough food. Most people went to bed hungry several times a year. Across Britain and its colonies, children had to work to eat, and the average child in the United Kingdom started work at age ten. One-fifth of the entire Swedish population, including many of my relatives, fled starvation to the United States, and only 20 percent of them ever returned. When the harvest failed and your relatives, friends, and neighbors starved to death, what did you do? You escaped. You migrated. If you could.
[…] In 1997, 42 percent of the population of both India and China were living in extreme poverty. By 2017, in India, that share had dropped to 12 percent: there were 270 million fewer people living in extreme poverty than there had been just 20 years earlier. In China, that share dropped to a stunning 0.7 percent over the same period, meaning another half a billion people had crossed this crucial threshold.
[…] Just 20 years ago, 29 percent of the world population lived in extreme poverty. Now that number is 9 percent. Today almost everybody has escaped hell.
Motor vehicle accidents show a similar hump-shaped pattern. Countries on Level 1 have fewer motor vehicles per person, so they do not have many motor vehicle accidents. In countries on Levels 2 and 3, the poorest people keep walking the roads while others start to travel by motor vehicles—minibuses and motorcycles—but roads, traffic regulations, and traffic education are still poor, so accidents reach a peak, before they decline again in countries on Level 4. The same goes for child drownings as a percentage of all deaths.
Alongside all the other improvements, our surveillance of suffering has improved tremendously. This improved reporting is itself a sign of human progress, but it creates the impression of the exact opposite.
No man ever steps in the same river twice, for it’s not the same river and he’s not the same man.
Heisenberg had brilliantly intuited a way of representing the quantum world and asking questions about it using such symbols, while being unaware of matrix algebra. In a few frenetic months, Born, along with Heisenberg and Pascual Jordan, developed what’s now known as the matrix mechanics formulation of quantum physics. In England, Paul Dirac saw the light too when he encountered Heisenberg’s work, and he too, in a series of papers, independently added tremendous insight and mathematics to the formulation and developed the “Dirac notation” that’s still in use today.
The world used to be divided into two but isn’t any longer. Today, most people are in the middle. There is no gap between the West and the rest, between developed and developing, between rich and poor. And we should all stop using the simple pairs of categories that suggest there is.
You won’t find any countries where child mortality has increased. Because the world in general is getting better.
Only 9 percent of the world lives in low-income countries. And remember, we just worked out that those countries are not nearly as terrible as people think.
Child mortality was highest in tribal societies in the rain forest, and among traditional farmers in the remote rural areas across the world.
Describing his moment of euphoria: It was almost three o’clock in the morning before the final result of my computations lay before me […] I could no longer doubt the mathematical consistency and coherence of the kind of quantum mechanics to which my calculations pointed. At first, I was deeply alarmed. I had the feeling that, through the surface of atomic phenomena, I was looking at a strangely beautiful interior, and felt almost giddy at the thought that I now had to probe this wealth of mathematical structures nature had so generously spread out before me. I was far too excited to sleep, and so, as a new day dawned, I made for the southern tip of the island, where I had been longing to climb a rock jutting out into the sea. I now did so without too much trouble, and waited for the sun to rise.
With sprints, there are no breaks, little autonomy, and insufficient time to prepare. No wonder developers today seem more stressed! The process is ill-suited to the nature of their work, and they are powerless to change it. The only remedy is to restore autonomy and professionalism to software development. Let developers control both their craft and their process. Treat them as respected peers, not replaceable cogs in a machine.
Sprints are problematic for the simple fact that they never let up. Sprints are not simply shorter deadlines, encountered sporadically as you move along. They are forever repeating, back-to-back deadlines. Waterfall was structured around genuine deadlines and real-world events that demanded focused attention. You worked hard to get something working, then you were done. High pressure was followed by low pressure. Sprints, on the other hand, are fake deadlines, invented for the sake of a process. Since they are contrived, they have no natural breaks or resting periods. There is no time to breathe, no time to collect yourself.
If you are charged with getting a task done, what proportion of your time ought to be dedicated to actually doing the task? Not 100 percent. There ought to be some provision for brainstorming, investigating new methods, figuring out how to avoid doing some of the subtasks, reading, training, and just goofing off.
Consider, for example, a study with male mice divided into groups: some were subjected to involuntary running periods, while others had the autonomy to control their own exercise. Even though both groups ran the same distance (the amount of running was distance-matched), the mice that were forced to run exhibited distinct signs of stress, fear, and discomfort (poor little fellas!).
Believing programmers to be primarily self-managing, we have come to value:
Flat organizations over hierarchical ones
Decentralized decision-making over centralized control
Equity and/or profit sharing over salary or hourly pay
Choice and self-direction over standardization and central planning
A marketplace of ideas over forced consistency
Voluntary and free collaboration over assigned interactions
Persuasion and natural authority over compulsion and formal positions
Roles and responsibilities over assigned tasks
Direct customer interaction over product organizations
Transparency of corporate information over permission-based visibility
Ad hoc demos of working software over sprints and/or fixed milestones
We recognize that every organization must deploy some of the methods on the right, but we assert that their use should be reduced to a minimum.
Another possibility of obtaining literal accounts is the use of a recording machine, either visible or hidden—a measure which, in my view, is absolutely against the fundamental principles on which psycho-analysis rests, namely the exclusion of any audience during an analytic session.
Not only do I believe that the patient, if he had any reason to suspect that a machine was being used (and the unconscious is very perspicacious), would not speak and behave in the way he does when he is alone with the analyst; but I am also convinced that the analyst, speaking to an audience which the machine implies, would not interpret in the same natural and intuitive way as he does when alone with his patient.
The research by the remarkable plant geneticist Barbara McClintock (1902–1992), showing that corn genes or parts of genes could change location on chromosomes during cell division, was largely ignored. For many years the phenomenon of “jumping genes” was mainly considered an eccentric curiosity or oddity. The findings that some 50 percent of the human genome and as much as 80 percent of the corn genome are composed of what are now called “transposable elements” indicate the significance of McClintock’s research, for which she was belatedly awarded a Nobel Prize in 1983.
The work of the English chemist and crystallographer Rosalind Franklin (1920–1958) provided critical evidence for ferreting out the structure of DNA. James Watson and Francis Crick essentially appropriated Franklin’s work and originally did not even acknowledge the significance of her contribution to “their” discovery. Crick has since died, but Watson, who as a world-famous geneticist really should know better, has become infamous and a pariah within the scientific community for his openly espoused racist and sexist bigotry.
Ada Lovelace is recognized by many as the world’s first computer programmer. But Lovelace’s notes on Babbage’s analytical engine gained little attention when they were originally published in 1843 (under her initials A.A.L.). It wasn’t until they were republished in B.V. Bowden’s 1953 Faster Than Thought: A Symposium on Digital Computing Machines that her work found a much wider audience.
All six primary programmers for the first modern computer, ENIAC, were women—Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Wescoff, Fran Bilas, and Ruth Lichterman. They are most often referred to as “computers” and “the ENIAC Girls.” They, too, received little attention at the time they worked; programming was undervalued precisely because it was done almost entirely by women. These women weren’t even invited to the dinner following the announcement that the machine worked in 1946.
The very idea of testing for racial difference is itself racist.
The existence of racism means that people are viewed through a distorted lens, leading to mistaken assumptions regarding cause and effect. For example, assuming that race is a biological reality encourages scientists to search for genetic explanations for the high incidence of diseases such as asthma or hypertension among the African American population. However, the cause for the high incidence of these diseases is racism and the resulting stresses and environmental contamination in African American communities. It is a similar issue for the supposed differences in intellectual abilities between supposed races.
This proves what many of us have suspected all along: boys are genetically inferior when it comes to reading, at least careful reading. Their brains are not wired for words. So stop trying to make excuses for things like guys failing to understand mortgage contracts or IPCC reports on climate science. This is not a social failing; it’s because of evolutionary inheritance. Back in the cave age, males who got absorbed in reading were eaten by sabretooths or something. Pretending that biological differences don’t exist is just Political Correctness, and we know how horrible that is.
No group is innately intellectually or morally superior to another. We are not compelled by our biology to act in antisocial ways—to be greedy, selfish, and competitive, for men to dominate women, for whites to discriminate against people of color. Rather than being written in our genes, racism, sexism, and other forms of discrimination and oppression are the creation of unequal societies. The future is therefore open to the creation of a genuinely equal society.
Studies of genetically identical mice in different environments have shown some surprising results. One group was observed over a three-month period in an intricate environment of great diversity, while another group of control mice were kept in plain cages. Even though the mice were all genetically identical and behaved similarly at the beginning of the experiment, their exploratory behavior was markedly different by the end. Mice that moved around a lot and explored grew new nerve cells in the hippocampus, an area of the brain influenced by environmental complexity. In other words, the brain structure of genetically identical mice changed as a result of their life activity.
A study at Stanford University even shows that although there have been significant gains by women in math and science, scientists are still pictured as overwhelmingly male and white—that is, children tend to depict scientists as male and white, even though the number of female and nonwhite scientists has been steadily increasing. And findings of research on the development of gender stereotypes in children indicate continuing challenges: “Many children assimilate the idea that brilliance is a male quality at a young age. This stereotype begins to shape children’s interests as soon as it is acquired and is thus likely to narrow the range of careers they will one day contemplate.”
In nature nothing takes place in isolation. Everything affects and is affected by every other thing, and it is mostly because this manifold motion and interaction is forgotten that our natural scientists are prevented from gaining a clear insight into the simplest things.
In 1978, Charles Goldfarb, the principal developer of GML, joined with other advocates of generalized markup to begin work on SGML. SGML has become a widely used language for describing industrial documents. In 1990, Tim Berners-Lee chose SGML as the basis for his World Wide Web hypertext language, the original HTML. The success of HTML spawned XML, now (February 2005) the favored format for structured data exchange.
PUB is all but forgotten. It is not even mentioned in most histories of Scribe and TeX. But its reinventions, JavaScript and PHP, have become indispensable in the world of web authoring. As such, history has confirmed what Les postulated at least as early as 1971: built-in markup tags will never be able to handle the variety of formatting effects that authors and publishers require. To achieve effects that a markup language designer could not have anticipated, a powerful scripting language should be provided.
In 1963, Jerry Saltzer developed RUNOFF, the first known computer-based system in which an author could insert markup codes into a digital manuscript and run the file through a markup processor to generate a formatted document. Around the same time, inspired by a demo of RUNOFF or a similar program, John Seybold founded a typesetting service bureau, ROCAPPI (1963-70). ROCAPPI is said to have developed powerful editing and layout systems and even a form of generic markup.
RUNOFF spawned Joe Ossanna’s troff/nroff (1973), later enhanced by Brian Kernighan, co-developer of Unix and C, and still popular on Unix systems. IBM’s SCRIPT, released circa 1967, had a similar syntax to RUNOFF. By 1974, SCRIPT had spawned the PUB-like NSCRIPT and Waterloo Script, as well as the generalized markup language, GML.
At least two PUB users reacted to these shortcomings by developing a better language. Brian Reid, then at CMU, developed Scribe for nontechnical users. He implemented the first version (Cafe) entirely in PUB (see “Chapter 9: An Evaluation of the System” on page 110 of Scribe: A Document Specification Language and Its Compiler). Don Knuth developed TeX for authors of mathematical texts.
I developed the Smalltalk Browser, an ancestor of today’s IDEs (integrated development environments).
Tesler’s Law, also known as The Law of Conservation of Complexity, states that for any system there is a certain amount of complexity which cannot be reduced.
[…] Larry Tesler argues that, in most cases, an engineer should spend an extra week reducing the complexity of an application versus making millions of users spend an extra minute using the program because of the extra complexity. However, Bruce Tognazzini proposes that people resist reductions to the amount of complexity in their lives. Thus, when an application is simplified, users begin attempting more complex tasks.
One of many contributions the Lisa made to the GUI was the dialog box, a vehicle for providing parameters to a modeless command. Rod Perkins designed Lisa dialog boxes. The typical dialog prevented the user from continuing work while it was open. That made it modal. But the widgets within the dialog could be operated in any order, making it locally modeless. And the mode escape was performed in a consistent way, by clicking dismissal buttons that were consistently located and labeled.
Law of Conservation of Complexity: Every system has an irreducible amount of complexity; the only question is, who is going to have to deal with it? The user? The application programmer? Or the platform developer?
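One way to picture the law in code (an illustrative sketch with a hypothetical API, not an example from Tesler): the retry logic a caller no longer writes does not disappear; it moves into the library, i.e., onto the application programmer.

```python
import time

def fetch_with_retries(get, url, retries=3, backoff=0.0):
    """Simple front door: callers just pass a URL.

    The retry/backoff complexity the caller no longer sees lives here.
    """
    last_error = None
    for attempt in range(retries):
        try:
            return get(url)
        except ConnectionError as err:
            last_error = err
            time.sleep(backoff * attempt)  # wait longer after each failure
    raise last_error

# A flaky transport (made up for the demo) that fails twice, then succeeds.
calls = {"n": 0}
def flaky_get(url):
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return f"200 OK: {url}"

result = fetch_with_retries(flaky_get, "http://example.com")
print(result, "after", calls["n"], "attempts")
```

The complexity budget is fixed; the design question is only who pays it.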
During my first week on the job, Bill English asked me to work with another new hire, Jeff Rulifson, to develop a vision of the future of editing. Rulifson and I met several times to brainstorm. When I confided my concerns about the NLS command language, he revealed that he had designed it. He had meant it to serve as a temporary tool for software testing. Engelbart’s team had run usability studies and made incremental improvements, but they had not seriously considered suffix syntax,
[…] Rulifson and I also discussed the use of graphics in interfaces. He had recently read a book about semiotics that defined an icon as a labeled pictogram and mentioned its potential relevance to interactive computing.
We circulated a few versions of a white paper around PARC. It was entitled “OGDEN: An Overly General Display Editor for Non-programmers.” We proposed iconic user interfaces with desks and file cabinets. We also proposed modeless postfix syntax with cut and paste. Rulifson’s willingness to turn the user interface he had designed for NLS on its head made it much easier to get the rest of the POLOS team to consider my proposals.
Next up for me was the page-makeup system that Ginn and Company had requested. I used Smalltalk to build a prototype called Cypress. After the user made a selection, an edit menu would pop up automatically nearby, as on today’s iPhone.
In the 1970s, move and copy were edits that users wanted to perform and cut/copy-paste was a new way for users to perform them. Now these terms have reversed roles. Users don’t say they want to “move” things; they say they want to “cut and paste” them.
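The reversal is easy to model with a toy text buffer (illustrative only): "move" is just cut followed by paste through a clipboard, the mechanism Tesler's team introduced.

```python
# Toy text buffer: a "move" decomposes into cut + paste via a clipboard.
clipboard = ""

def cut(buf, start, end):
    """Remove buf[start:end] into the clipboard; return the shortened buffer."""
    global clipboard
    clipboard = buf[start:end]
    return buf[:start] + buf[end:]

def paste(buf, pos):
    """Insert the clipboard contents at pos."""
    return buf[:pos] + clipboard + buf[pos:]

text = "world hello "
text = cut(text, 0, 6)   # clipboard now holds "world "
text = paste(text, 6)    # a "move" is just a cut followed by a paste
print(repr(text))        # 'hello world '
```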
In 1973, I joined Xerox Palo Alto Research Center (PARC) as a member of the PARC Online Office System (POLOS) team but spent some of my time working on Smalltalk with Alan Kay’s Learning Research Group. One reason I was interested in working with Kay was that his invention of overlapping windows was motivated by a desire to find alternatives to modes.
A number of interesting software systems were coded in SAIL, including some early versions of FTP and TeX, a document formatting system called PUB, and BRIGHT, a clinical database project sponsored by the National Institutes of Health.
Right now it’s only a notion, but I think I can get the money to make it into a concept, and later turn it into an idea.
When surveyed as to why their projects failed so hard and so often, the number one cause IT managers cited was “lack of user involvement,” with “incomplete requirements” a close second.
My aim is not to teach the method that everyone ought to follow in order to conduct his reason well, but solely to reveal how I have tried to conduct my own.
Apple’s often raw simplicity is intentional. Apple’s iOS icons tell us clearly: This is “the” phone, task, health, notes, book app.
Unix-the-concept was portable because there existed many different variants for different machines, but the implementation was not.
One thing that put NT ahead of contemporary Unix systems is that the kernel itself can be paged out to disk too. Obviously not the whole kernel—if it all were pageable, you’d run into the situation where resolving a kernel page fault requires code from a file system driver that was itself paged out—but large portions of it are. This is not particularly interesting these days because kernels are small compared to the typical installed memory on a machine, but it certainly made a big difference in the past when every byte was precious.
As much as we like to bash Windows for security problems, NT started with an advanced security design by the standards of the early Internet, given that the system works, basically, as a capability-based system. The first user process that starts after logon gets an access token from the kernel representing the privileges of the user session, and the process and its subprocesses must supply this token to the kernel to assert their privileges. This is different from Unix, where processes just have identifiers and the kernel keeps track of what each process can do in the process table.
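The contrast can be sketched as a toy model (illustrative Python, not the real NT API): the caller presents a token, and the check is on the capability presented rather than on a per-process identity looked up in a table.

```python
# Toy model of token-based access checks; names and shapes are invented
# for illustration and do not mirror the actual NT kernel interfaces.
class Token:
    def __init__(self, privileges):
        self.privileges = frozenset(privileges)

def kernel_open_file(token, path, access):
    # The check is on the capability the caller presents,
    # not on who the caller is.
    if access not in token.privileges:
        raise PermissionError(f"token lacks {access!r} for {path}")
    return f"handle:{path}"

session = Token({"read"})   # token granted to the first process at logon
child_token = session       # subprocesses inherit and supply the same token

print(kernel_open_file(child_token, "/tmp/demo", "read"))  # handle:/tmp/demo
```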
The NT kernel has various interrupt levels (SPLs in BSD terminology) to determine what can interrupt what else (e.g. a clock interrupt has higher priority than a disk interrupt) but, more importantly, the kernel threads can be preempted by other kernel threads. This is “of course” what every high-performance Unix system does today, but it’s not how many Unixes started: those systems started with a kernel that didn’t support preemption nor multiprocessing; then they added support for user-space multiprocessing; and then they added kernel preemption. […] So it is interesting to see that NT started with the right foundations from its inception.
NTFS was a really advanced file system for its time even if we like to bash on it for its poor performance (a misguided claim). The I/O subsystem of NT, in combination with NTFS, brought 64-bit addressing, journaling, and even Unicode file names. Linux didn’t get 64-bit file support until the late 1990s and didn’t get journaling until ext3 launched in 2001. Soft updates, an alternate fault tolerance mechanism, didn’t appear in FreeBSD until 1998. And Unix represents filenames as nul-terminated byte arrays, not Unicode.
Other features that NT included at launch were disk striping and mirroring—what we know today as RAID—and device hot plugging. These features were not a novelty, given that SunOS had included RAID support since the early 1990s, but what’s interesting is that they were all accounted for as part of the original design.
Named pipes are a local construct in Unix: they offer a mechanism for two processes on the same machine to talk to each other with a persistent file name on disk. NT has this same functionality, but its named pipes can operate over the network. By placing a named pipe on a shared file system, two applications on different computers can communicate with each other without having to worry about the networking details.
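A Unix FIFO shows the local half of this: a persistent name on disk that two parties rendezvous through. The sketch below stands in one process with a thread playing the writer; the NT-style network path in the comment (`\\server\pipe\chat`) is illustrative only.

```python
import os
import tempfile
import threading

# A Unix named pipe (FIFO) is a rendezvous point on the local file system.
# NT named pipes extend the same idea across the network: think of a path
# like \\server\pipe\chat instead of /tmp/chat (that NT path is illustrative).
with tempfile.TemporaryDirectory() as d:
    fifo = os.path.join(d, "chat")
    os.mkfifo(fifo)                        # create the persistent name on disk

    def writer():
        with open(fifo, "w") as f:         # blocks until a reader opens the pipe
            f.write("hello over the pipe")

    t = threading.Thread(target=writer)
    t.start()
    with open(fifo) as f:                  # the "other process" reads from it
        data = f.read()                    # EOF when the writer closes its end
    t.join()

print(data)  # hello over the pipe
```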
One important piece of the NT executive is the Hardware Abstraction Layer (HAL), a module that provides abstract primitives to access the machine’s hardware and that serves as the foundation for the rest of the kernel. This layer is the key that allows NT to run on various architectures, including i386, Alpha, and PowerPC. To put the importance of the HAL in perspective, contemporary Unixes were coupled to a specific architecture: yes, Unix-the-concept was portable because there existed many different variants for different machines, but the implementation was not.
Even though protobuf and gRPC may seem like novel ideas due to their widespread use, they are based on old ones. On Unix, we had Sun RPC from the early 1980s, primarily to support NFS. Similarly, NT shipped with built-in RPC support: its own DSL, known as MIDL, to specify interface definitions and generate code for remote procedures, and its own facility to implement RPC clients and servers.
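The shape of the idea—call a remote procedure as if it were local—survives unchanged in Python’s standard library. XML-RPC is only a minimal stand-in for MIDL or Sun RPC tooling (the endpoint and function here are arbitrary choices for the demo):

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# The RPC idea long predates protobuf/gRPC: declare a procedure, and let
# stubs marshal the call over the wire so it looks like a local invocation.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(lambda a, b: a + b, "add")
threading.Thread(target=server.handle_request, daemon=True).start()  # serve one call

port = server.server_address[1]
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.add(2, 3)   # the call crosses a (local) network boundary
server.server_close()
print(result)  # 5
```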
Unix systems have never been big on supporting arbitrary drivers: remember that Unix systems were typically coupled to specific machines and vendors. NT, on the other hand, intended to be an OS for “any” machine and was sold by a software company, so supporting drivers written by others was critical. As a result, NT came with the Network Driver Interface Specification (NDIS), an abstraction to support network card drivers with ease. To this day, manufacturer-supplied drivers are just not a thing on Linux, which leads to interesting contraptions like the ndiswrapper, a very popular shim in the early 2000s to be able to reuse Windows drivers for WiFi cards on Linux.
A major goal of NT was to be compatible with applications written for legacy Windows, DOS, OS/2, and POSIX.
This need for compatibility forced NT’s design to be significantly different from Unix’s. In Unix, user-space applications talk to the kernel directly via its system call interface, and this interface is the Unix interface. Oftentimes, but not always, the C library provides the glue to call the kernel, and applications never issue system calls themselves—but that’s a minor detail.
Contrast this to NT, where applications do not talk to the executive (the kernel) directly. Instead, each application talks to one specific protected subsystem, and these subsystems are the ones that implement the APIs of the various operating systems that NT wanted to be compatible with. These subsystems are implemented as user-space servers (they are not inside the NT “microkernel”).
What makes the I/O subsystem of NT much more advanced than Unix’s is that its interface is asynchronous in nature, and it has been since the very beginning.
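The "issue now, collect the completion later" shape of an asynchronous interface can be sketched with Python's asyncio; this illustrates the programming model only, not NT's actual overlapped-I/O API, and the delays stand in for device latency:

```python
import asyncio

# A synchronous Unix-style read blocks the caller until the data arrives.
# An asynchronous interface lets you issue several operations up front and
# collect completions as they happen, possibly out of order.
async def slow_read(name, delay):
    await asyncio.sleep(delay)            # stands in for the device latency
    return f"{name}: done"

async def main():
    # Issue two "I/O requests" immediately, then wait for the completions.
    disk = asyncio.create_task(slow_read("disk", 0.02))
    net = asyncio.create_task(slow_read("net", 0.01))
    return [await net, await disk]        # the shorter operation finishes first

completions = asyncio.run(main())
print(completions)  # ['net: done', 'disk: done']
```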
For the longest time, Unix only offered the simplistic read/write/execute permission sets for each file. NT, on the other hand, came with advanced ACLs from the get-go—something that’s still a sore spot on Unix. Even though Linux and the BSDs now have ACLs too, their interfaces are inconsistent across systems and they feel like an alien add-on to the design of the system. On NT, ACLs work at the object level, which means they apply consistently across all kernel features.
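What an ACL buys over rwx bits is an ordered list of entries, including explicit denies, evaluated first-match-wins. A toy sketch with simplified semantics and invented names (real NT ACE evaluation has more rules than this):

```python
# Toy ACL: an ordered list of access-control entries (principal, kind, rights),
# evaluated first-match-wins, with explicit "deny" entries possible.
# Semantics are simplified and all names are invented for illustration.
def check(acl, principal, right):
    for who, kind, rights in acl:
        if who == principal and right in rights:
            return kind == "allow"
    return False  # no matching entry: denied by default

acl = [
    ("bob",   "deny",  {"write"}),          # an explicit deny, listed first
    ("staff", "allow", {"read", "write"}),
    ("bob",   "allow", {"read"}),
]
print(check(acl, "bob", "read"))    # True: the allow entry matches
print(check(acl, "bob", "write"))   # False: the deny entry matches first
```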
Smalltalk was created with the education of adolescents in mind - the iPad thinks of this group as a market segment.
HyperCard also illustrates some of the difficulties that might be responsible for the gradual shift away from Xerox PARC-like open models of personal computing. According to rumor, the developer of HyperCard, Bill Atkinson, allegedly gave the product to Apple in 1987, with the understanding that it would be distributed for free with each Mac. The program was an immediate success. HyperCard produced a tremendous amount of feedback from the community, but since it was a free product, Apple wasn’t sure how many internal resources should be devoted to HyperCard development.
HyperCard was, by comparison, much closer to the Dynabook ethos. In a sense, the iPad is the failed “HyperCard Player” brought to corporate fruition.
In a world where centralized technology like Google can literally give you a good guess at any piece of human knowledge in milliseconds, it’s a real tragedy that the immense power of cheap, freely available computational systems remains locked behind opaque interfaces, obscure programming languages, and expensive licensing agreements.
Before the first nuclear test at Los Alamos a man asked me to check some arithmetic he had done, […] When I asked what it was, he said, “It is the probability that the test bomb will ignite the whole atmosphere.” […] The next day when he came for the answers I remarked to him, “The arithmetic was apparently correct but I do not know about the formulas for the capture cross sections for oxygen and nitrogen-after all, there could be no experiments at the needed energy levels.” He replied, like a physicist talking to a mathematician, that he wanted me to check the arithmetic not the physics, and left. I said to myself, “What have you done, Hamming, you are involved in risking all of life that is known in the Universe, and you do not know much of an essential part?” I was pacing up and down the corridor when a friend asked me what was bothering me. I told him. His reply was, “Never mind, Hamming, no one will ever blame you.”
Even modern extensible software like Firefox tends to separate use from extension development - the average user might have no idea that Firefox supports user extensions. In HyperCard, these features were “on the surface” of the design.
While the Dynabook was meant to be a device deeply rooted in the ethos of active education and human enhancement, the iDevices are essentially glorified entertainment and social interaction (and tracking) devices, and Apple-controlled revenue-stream generators for developers. The entire “App Store” model, then, works to divide the world into developers and software users, whereas the Xerox PARC philosophy was for there to be a continuum between these two states.
Perhaps seeking a way of turning the HyperCard phenomenon into a revenue stream, Apple eventually transferred HyperCard development to a subsidiary company, which attempted to transform it into a profitable business. HyperCard was no longer released for free; a locked-down version, capable of playing, but not developing, HyperCard applications was freely available, while the “developer’s edition,” recognizable as just HyperCard, was available for purchase. In the effort to make HyperCard into a business model, Apple had inadvertently separated users into “developers” and “users.” This, combined with the development of work-alikes with more features, seemed to destroy HyperCard’s momentum, and, despite later attempts at revival at Apple, the system fell out of use.
An integral part of the Xerox PARC Philosophy was to dismantle the wall between software developers and computer users, to develop systems so easy to program that doing so would be a natural, simple aspect of computer use.
I think the style languages appeal to people who have a certain mathematical laziness to them. Laziness actually pays off later on, because if you wind up spending a little extra time seeing that “oh, yes, this language is going to allow me to do this really, really nicely, and in a more general way than I could do it over here,” usually that comes back to help you when you’ve had a new idea a year down the road. The agglutinative languages, on the other hand, tend to produce agglutinations and they are very, very difficult to untangle when you’ve had that new idea.
One way of looking at languages is in this way: a lot of them are either the agglutination of features or they’re a crystallization of style. Languages such as APL, Lisp, and Smalltalk are what you might call style languages, where there’s a real center and imputed style to how you’re supposed to do everything.
If you look at software today, through the lens of the history of engineering, it’s certainly engineering of a sort—but it’s the kind of engineering that people without the concept of the arch did. Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves.
One could actually argue—as I sometimes do—that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects.
You could think of it as putting a low-pass filter on some of the good ideas from the ’60s and ’70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.
Corporate buyers often buy in terms of feature sets. But at PARC our idea was, since you never step in the same river twice, the number-one thing you want to make the user interface be is a learning environment.
A lot of the problems that computing has had in the last 25 years comes from systems where the designers were trying to fix some short-term thing and didn’t think about whether the idea would scale if it were adopted.
Once you have something that grows faster than education grows, you’re always going to get a pop culture.
In one anecdote, Kay relates showing a custom system (built in Smalltalk) meant to facilitate non-expert “programming,” to executives from Xerox PARC. This system was a kind of highly advanced programming language meant to make human-machine interaction at a very high level intuitive for non-expert users. At one point during a demonstration, a vice president, after an hour of working with the system, realized he was programming. What they accomplished, then, was a keystone for a software system which Kay felt bridged the gap between the numbers coursing through a CPU somewhere, and human intuitive reasoning.
The adoption of programming languages has very often been somewhat accidental, and the emphasis has very often been on how easy it is to implement the programming language rather than on its actual merits and features. For instance, Basic would never have surfaced because there was always a language better than Basic for that purpose. That language was Joss, which predated Basic and was beautiful. But Basic happened to be on a GE timesharing system that was done by Dartmouth, and when GE decided to franchise that, it started spreading Basic around just because it was there, not because it had any intrinsic merits whatsoever.
I grew up with macro systems in the early ’60s, and you have to do a lot of work to make them work for you—otherwise, they kill you.
I came up with a facetious sunspot theory, just noting that there’s a major language or two every 10½ years, and in between those periods are what you might call hybrid languages.
The death of Smalltalk in a way came as soon as it got recognized by real programmers as being something useful; they made it into more of their own image, and it started losing its nice end-user features.
In fact, let’s not even worry about Java. Let’s not complain about Microsoft. Let’s not worry about them because we know how to program computers, too, and in fact we know how to do it in a meta-way. We can set up an alternative point of view, and we’re not the only ones who do this, as you’re well aware.
There has been much discussion about whether reality TV participants deserve union representation and stronger legal protections. I think the argument against those measures is usually that these people know what they’re signing up for. But that does not mean they are not vulnerable to emotional trauma as part of the experience.
Science is a frustratingly non-linear process. Facts that seemed unquestionable a decade ago can be overturned by new data at any moment.
At the turn of the 20th century, dental researchers noticed that many people living near high-fluoride water sources had splotches on their teeth, a condition called fluorosis. Once scientists began studying the effects of fluoride on dental health, they realized that, while fluoride can damage teeth and bones at high concentrations, drinking water with low levels of fluoride actually prevents tooth decay. Today, many cities add a small amount of fluoride to their tap water — about 0.7 milligrams per liter (mg/L).
AVPlayer is a system player component provided by Apple. It offers the best efficiency, performance and system integration, but the number of playable video formats is limited. This means that for Invidious/Piped videos the maximum resolution you can play with it is 1080p. There’s no way to play higher-resolution files, as they are not provided in the right formats. Obviously, modern Apple devices are more than capable of hardware-decoding and playing these formats. And in fact, Apple seems to be giving Google a special entitlement that allows them to enable VP9/AV1 decoding. Just remember that next time you hear how Apple treats all developers equally.
Solitude is only good when surrounded by people.
Talking about the famous catastrophic electrical short circuit at the LHC: The accelerator technicians had made a risk analysis of the joints between electrical cables and concluded that there was only a one-in-ten-thousand chance of failure. The problem is that the LHC contained a total of ten thousand joints. The vagaries of chance had conspired against them.
Steve Jobs used to run an annual retreat for what he considered the 100 most important people at Apple, and these were not the 100 people highest on the org chart. Can you imagine the force of will it would take to do this at the average company? And yet imagine how useful such a thing could be. It could make a big company feel like a startup.
The costume turned out to be formal morning dress in the mid-nineteenth-century style of Alfred Nobel’s time. As Higgs recalled, “Getting into the shirt alone takes considerable skill. It was almost a problem in topology.”
When CERN turned on the LHC for the first time in 2008 and media coverage exceeded everyone’s expectations, Eurovision told us that at some point in the day a total of one billion people were watching. It was probably because they thought we were going to blow up the universe.
William Waldegrave became so interested that in 1993 he challenged the British physicists to help him explain the Higgs boson and make the case for funding the LHC during discussions of an upcoming budget with other cabinet ministers, including the Chancellor of the Exchequer. He offered a bottle of vintage champagne as a prize for the best effort.
A winning entry by David Miller, […] cleverly used a political analogy to grab Waldegrave’s attention. He imagined the Higgs field as a crowd of political workers at a cocktail party. Former prime minister Margaret Thatcher played the part of a massless particle that enters the room and encounters the field of acolytes. She tries to traverse the room, but the occupants want to shake her hand. This interrupts her, creating inertia. Her interactions with the gathering have altered her from a flighty massless particle into a massive lumbering one. In similar fashion, a massless particle gains inertia—mass—because of its interactions with the ubiquitous Higgs field.
Ideas were never sold at the market price.
Generative AI is ripping the humanity out of things. Built on a foundation of theft, the technology is steering us toward a barren future.
Navigation. Exploration. Browsing. Surfing. The web was akin to a virtual manifestation of physical space. We even had a word for it: webspace.
It has been estimated that, over the past decade, hundreds of security guards have been arrested for manslaughter or murder. In 2019, California saw two incidents in which Allied-employed guards allegedly knelt on the necks of restrained citizens and killed them. The year before, Allied-employed guards harassed and threatened to fight an unarmed black man at Denver’s Union Station. One guard then led the man into a station bathroom, where he was beaten unconscious and suffered permanent brain damage. Last May, the San Francisco district attorney’s office declined to file charges against an Allied-employed guard who shot and killed an unarmed black man suspected of shoplifting from a Walgreens.
Today’s web browsers want to be invisible, merging with the visual environment of the desktop in an effort to convince users to treat “the cloud” as just an extension of their hard drive. In the 1990s, browser design took nearly the opposite approach, using iconography associated with travel to convey the feeling of going on a journey. Netscape Navigator, which used a ship’s helm as its logo, made a very direct link with the nautical origins of the prefix cyber-, while Internet Explorer’s logo promised to take the user around the whole globe. This imagery reinforced the idea that the web was a very different kind of space from the “real world,” one where the usual laws and taxes shouldn’t apply.
A painter wouldn’t add more red to her painting or change the composition because market data showed that people liked it better. It’s her creative vision; some people might like it, others might not. But it is her creation, with her own rules. The question of “performance” is simply irrelevant. It’s the same thing with the small web.
If the commercial web is “industrial”, you could say that the small web is “artisanal”. One is not better than the other. They serve different needs and both can co-exist in an open web. It would nevertheless be a shame if we only spent time on the commercial web and never got the opportunity to experience the creativity, passion and quirkiness of the small web.
The design of the modern commercial web is also “sanitized”: it is polished, follows conventions and is optimized for efficiency. This is one of the reasons so many websites you go to today look and feel the same. The codes of the commercial web have become so dominant that we have forgotten that the small web still exists and has completely different priorities.
[…] Modern websites are designed to direct user behavior towards certain goals: a purchase, a click, a share or a sign-up. The words, the colours, the message are tailored to these goals, much like the packaging on products in the supermarket.
A 2008 Smashing Magazine article called “10 Principles Of Good Website Design” was the top result when I searched for “good web design” whilst writing this essay. There are many others, some more recent, but they all pretty much say the same thing. The author of this one even clearly begins by stating that “[…] user-centric design has established as a standard approach for successful and profit-oriented web design”.
But the web is not always “profit-oriented” and it certainly does not need to be “user-centric” (and I say this as a UX consultant). If it were, there would be very little creativity and self-expression left. The rich diversity of the web would be reduced to the online equivalent of a massive, orderly, clinical shopping mall meant to drive sales. No, the web can just as well be “author-centered”, hobby-centered or even be dog-centered!
Crows and ravens […] hold grudges against each other, do basic statistics, perform acrobatics, and even host funerals for deceased family members.
Unlike humans, who regularly copy each other’s behavior […] we don’t have much evidence that crows will watch each other and deliberately copy what another crow is doing. However, they will steal each other’s tools—in particular, juvenile crows often steal their parents’ tools when they are young. So it’s possible that young crows learn how to make different types of tools from experience stealing their parents’ tools, using them, remembering what these tools look like, and then trying to create something similar.
For their study, Sarah Jelbert and her colleagues first trained three hooded crows—Glaz (15 years old), Rodya (4 years old), and Joe (3 years old)—to recognize pieces of paper of different sizes and colors. To do this, they exposed the birds to “template” pieces of paper in different colors and sizes for several minutes before removing them—and then rewarded the birds for dropping scraps that matched these templates into a small slit.
The crows were next given the opportunity to manufacture versions of these objects in exchange for a reward. The researchers found that all three crows manufactured objects that matched the original template object they had been rewarded for in both color and size—even though the treats in this second stage of the experiment were awarded at random. The researchers also observed that Glaz, the oldest of the three hooded crows, seemed to be the most proficient at making scraps that looked like the ones the bird was trained on. This finding suggested to them that mental templates may be linked to experience garnered with age.
Under the Labour government, Chancellor of the Exchequer Stafford Cripps imposed a huge tax on tobacco. Peter Higgs’ father Tom Higgs, whom Peter described as a staunch Tory, stated, “I’m not going to contribute to a load of socialists!” and never smoked again.
An arid definition of symmetry in mathematics is a situation where change produces no change. For example, a circle is symmetric under rotation.
[…] but in then exposing the weakest link in an argument and persisting until he broke through to new vistas.
It is easy to be wise once all is understood.
In live programming environments, we see code on the left and a result on the right, but it’s the steps in between that matter most.
Learning cooking is not about guessing the functionality of your kitchen appliances. It’s about understanding how ingredients can be combined.
Likewise, guessing the third argument of the “ellipse” function isn’t “learning programming”. It’s simply a barrier to learning.
[…] individuals are drawn to one profession or another partly because of their unconscious predisposition or valency for one basic assumption rather than another. As a result, they are particularly likely to contribute to the interdisciplinary group processes without questioning them.
When under the sway of a basic assumption, a group appears to be meeting as if for some hard-to-specify purpose upon which the members seem intently set. Group members lose their critical faculties and individual abilities, and the group as a whole has the appearance of having some ill-defined but passionately involving mission. Apparently trivial matters are discussed as if they were matters of life or death, which is how they may well feel to the members of the group, since the underlying anxieties are about psychological survival.
In this state of mind, the group seems to lose awareness of the passing of time, and is apparently willing to continue endlessly with trivial matters. […] Other external realities are also ignored or denied. For example, instead of seeking information, the group closes itself off from the outside world and retreats into paranoia.
Task-oriented teams have a defined common purpose and a membership determined by the requirements of the task. Thus, in a multidisciplinary team, each member would have a specific contribution to make. Often, the reality is more like a collection of individuals agreeing to be a group when it suits them, while threatening to disband whenever there is serious internal conflict. It is as if participation were a voluntary choice, rather than there being a task which they must co-operate to achieve. The spurious sense of togetherness is used to obscure these problems and as a defence against possible conflicts. Even the conflicts themselves may be used to avoid more fundamental anxieties about the work by preventing commitment to decisions and change.
Talking about a shift from a paranoid-schizoid position to one in which there was a preponderance of depressive anxiety: The shift in emotional climate does not, however, result in freedom from anxiety. Instead, our fears of what others are doing to us are replaced by a fear of what we have done to others.
Schizoid splitting is normally associated with the splitting off and projecting outwards of parts of the self perceived as bad, thereby creating external figures who are both hated and feared. In the helping professions, there is a tendency to deny feelings of hatred or rejection towards clients. These feelings may be more easily dealt with by projecting them onto other groups or outside agencies, who can then be criticized. The projection of feelings of badness outside the self helps to produce a state of illusory goodness and self-idealization. This black-and-white mentality simplifies complex issues and may produce a rigid culture in which growth is inhibited.
The perceived power or powerlessness counts more than the actual, both of which depend on the inner world connectedness mentioned previously […] an individual in a state of demoralization or depression may well have adequate external resources to effect some change, but feel unable to do so on account of an undermining state of mind. In this case, power is projected, perceived as located outside the self, leaving the individual with a sense of powerlessness. By contrast, someone who attracts projected power is much more likely to take—and to be allowed to take—a leadership role.
The basic disposition of the consultant is important, too. The sense of security in the group is greatly encouraged by the consultant’s restraint from judging and blaming, and ‘knowing’ too much too soon, or seeming to believe in quick solutions.
In a misguided attempt to avoid fanning rivalry and envy, managers may try to manage from a position of equality, or, more commonly, pseudo-equality, often presented as ‘democracy’. The term is used as if everyone has equal authority. The hope is that rivalry, jealousy and envy will thereby be avoided; the reality is the undermining of the manager’s authority, capacity to hold an overall perspective and ability to lead.
True leadership requires the identification of some problem requiring attention and action, and the promotion of activities to produce a solution. In basic assumption mentality, however, there is a collusive interdependence between the leader and the led, whereby the leader will be followed only as long as he or she fulfils the basic assumption task of the group. The leader in baD is restricted to providing for members’ needs to be cared for. The baF leader must identify an enemy either within or outside the group, and lead the attack or flight. In baP, the leader must foster hope that the future will be better, while preventing actual change taking place. The leader who fails to behave in these ways will be ignored, and eventually the group will turn to an alternative leader. Thus the basic assumption leader is essentially a creation or puppet of the group, who is manipulated to fulfil its wishes and to evade difficult realities.
A leader or manager who is being pulled into basic assumption leadership is likely to experience feelings related to the particular nature of the group’s unconscious demands. In baD there is a feeling of heaviness and resistance to change, and a preoccupation with status and hierarchy as the basis for decisions.
In baF, the experience is of aggression and suspicion, and a preoccupation with the fine details of rules and procedures. In baP, the preoccupation is with alternative futures; the group may ask the leader to meet with some external authority to find a solution, full of insubstantial hopes for the outcome.
Members of such groups are both happy and unhappy. They are happy in that their roles are simple, and they are relieved of anxiety and responsibility. At the same time, they are unhappy insofar as their skills, individuality and capacity for rational thought are sacrificed, as are the satisfactions that come from working effectively. As a result, the members of such groups tend to feel continually in conflict about staying or leaving, somehow never able to make up their minds which they wish to do for any length of time.
Contribution of Melanie Klein: In play, children represent their different feelings through characters and animals either invented or derived from children’s stories: the good fairy, the wicked witch, the jealous sister, the sly fox and so on. This process of dividing feelings into differentiated elements is called splitting. By splitting emotions, children gain relief from internal conflicts. The painful conflict between love and hate for the mother, for instance, can be relieved by splitting the mother-image into a good fairy and a bad witch. Projection often accompanies splitting, and involves locating feelings in others rather than in oneself. Thus the child attributes slyness to the fox or jealousy to the bad sister. Through play, these contradictory feelings and figures can be explored and resolved.
[…] Early in childhood, splitting and projection are the predominant defenses for avoiding pain; Klein referred to this as the paranoid-schizoid position (‘paranoid’ referring to badness being experienced as coming from outside oneself, and ‘schizoid’ referring to splitting). This is a normal stage of development; it occurs in early childhood and as a state of mind it can recur throughout life. Through play, normal maturation or psychoanalytic treatment, previously separated feelings such as love and hate, hope and despair, sadness and joy, acceptance and rejection can eventually be brought together into a more integrated whole. This stage of integration Klein called the depressive position, because giving up the comforting simplicity of self-idealization, and facing the complexity of internal and external reality, inevitably stirs up painful feelings of guilt, concern and sadness. These feelings give rise to a desire to make reparation for injuries caused through previous hatred and aggression.
It is only with the provision of a containing environment that the institution can settle down to working at its task. Members need time to get to know each other and their roles in a task-oriented setting; ‘chats’ during coffee break or lunch-time are not sufficient, as they invariably shirk the most difficult issues of the day. It is only with time and ongoing work that staff can reach the important stage—personally, professionally and institutionally—of having the freedom to think their own thoughts, as opposed to following the institutional defensive ‘party line’. Only then will they be able to develop their own style of work, and contribute fully to the task in hand.
In his short but seminal paper, ‘Hate in the Countertransference’, Winnicott (1947) discussed the hate inevitably felt by psychoanalysts for their patients, and by mothers for their babies. He stressed that the capacity to tolerate hating ‘without doing anything about it’ depends on one’s being thoroughly aware of one’s hate. Otherwise, he warned, one is at risk of falling back on masochism. Alternatively, hate—or, in less dramatic terms, uncaring—will be split off and projected, with impoverishment of the capacity to offer good-enough care.
The depressive position is never attained once and for all. Whenever survival or self-esteem is threatened, there is a tendency to return to a more paranoid-schizoid way of functioning.
It was discovered that groups of workers supposedly doing similar jobs in separate coal mines in fact organized themselves very differently, and that this had significant effects on levels of productivity. This led to the concept of the self-regulating work group, and to the idea that differences in group organization reflect unconscious motives, which also affect the subjective experience of the work. It was through this project that the ‘socio-technical system’ came to be defined.
Freud found that there was often resistance to accepting the existence of the unconscious. However, he believed he could demonstrate its existence by drawing attention to dreams, slips of the tongue, mistakes and so forth as evidence of meaningful mental life of which we are not aware. What was then required was interpretation of these symbolic expressions from the unconscious. Ideas which have a valid meaning at the conscious level may at the same time carry an unconscious hidden meaning. For example, a staff group talking about their problems with the breakdown of the switchboard may at the same time be making an unconscious reference to a breakdown in interdepartmental communication. Or complaints about the distribution of car-park spaces may also be a symbolic communication about managers who have no room for staff concerns. The psychoanalytically oriented consultant takes up a listening position on the boundary between conscious and unconscious meanings, and works simultaneously with problems at both levels. It may be some time before the consultant can pick up and make sense of these hidden references to issues of which the group itself is not aware.
Freud argued that the members of a group, particularly large groups such as crowds at political rallies, follow their leader because he or she personifies certain ideals of their own. The leader shows the group how to clarify and act on its goals. At the same time, the group members may project their own capacities for thinking, decision-making and taking authority on to the person of the leader and thereby become disabled.
The consultant who undertakes to explore the nature of the underlying difficulty is likely to be seen as an object of both hope and fear. The conscious hope is that the problem will be brought to the surface, but at the same time, unconsciously, this is the very thing which institutions fear.
[…] we may be able to hear only the distress of the molested child, and not the communications about excitement or triumph, which we find more disturbing. The painful story is therefore not fully understood by either, and so gets repeated endlessly.
What needs attention is the listener’s own experiences, or countertransference (see Chapter 1), as the story is told. This conveys the essence of the trauma, how painful it was to be there, and can make it possible to discover the exact nature of the pain. The capacity to hear the message accurately requires the ability to pay attention to all aspects of one’s experience, and depends on many things.
Like individuals, institutions develop defenses against difficult emotions which are too threatening or too painful to acknowledge. […] But some institutional defenses, like some individual defenses, can obstruct contact with reality and in this way damage the staff and hinder the organization in fulfilling its task and in adapting to changing circumstances. Central among these defenses is denial, which involves pushing certain thoughts, feelings and experiences out of conscious awareness because they have become too anxiety-provoking
There are two main dangers in seeking to apply a purely psychoanalytic perspective to institutions. The first is that it may lead to attempts to develop members’ ‘sensitivity’ and insight into their own and the institution’s psychological processes, while ignoring the systemic elements that affect the work. In this case, instead of bringing about useful and needed change, their heightened sensitivity may add to members’ frustration and have a negative effect on the institution.
The second is the risk of what has been called ‘character assassination’, in which psychoanalytic theory is misused to disparage character and impugn motives. This can lead to attributing institutional problems to the individual pathology of one or more of its members. It can also lead to consultants’ undertaking or presenting their work in a way that pathologizes the behaviour and functioning of the institution and its individual members without giving due regard to the effectiveness with which the conscious real-world tasks of the organization are being pursued.
Aberrant baD gives rise to a culture of subordination where authority derives entirely from position in the hierarchy, requiring unquestioning obedience. Aberrant baP produces a culture of collusion, supporting pairs of members in avoiding truth rather than seeking it. There is attention to the group’s mission, but not to the means of achieving it. Aberrant baF results in a culture of paranoia and aggressive competitiveness, where the group is preoccupied not only by an external enemy but also by ‘the enemy within’. Rules and regulations proliferate to control both the internal and the external ‘bad objects’. Here it is the means which are explicit and the ends which are vague.
When we recognize that our painful feelings come from projections, it is a natural response to ‘return’ these feelings to their source: ‘These are your feelings, not mine.’ This readily gives rise to blaming, and contributes to the ricocheting of projections back and forth across groups and organizations. However, if we can tolerate the feelings long enough to reflect on them, and contain the anxieties they stir up, it may be possible to bring about change. At times when we cannot do this, another person may temporarily contain our feelings for us.
Whether or not there is an external consultant, it is necessary for members to learn not just to listen to the content of what is brought to the discussion, but also to allow the emotional impact of the communications to work on and inside themselves.
In many work situations, the chief anxiety which needs to be contained is the experience of inadequacy.
The proposal was that it should be ‘to enable patients to live out the remainder of their lives in as full, dignified and satisfying a way as possible’, […] This definition would mean that all the various professionals involved in patient care could see their particular work as contributing to a common purpose, rather than having conflicting and competing aims.
This change in task definition had major implications. It invited re-examination of practices previously taken for granted, such as the nurses’ emphasis on safety as a priority, with its consequent depersonalization and loss of dignity for patients.
According to Wilfred Bion, much of the irrational and apparently chaotic behaviour we see in groups can be viewed as springing from basic assumptions common to all their members. […]
Basic assumption dependency (baD) A group dominated by baD behaves as if its primary task is solely to provide for the satisfaction of the needs and wishes of its members. The leader is expected to look after, protect and sustain the members of the group, to make them feel good, and not to face them with the demands of the group’s real purpose. The leader serves as a focus for a pathological form of dependency which inhibits growth and development. […]
Basic assumption fight-flight (baF) The assumption here is that there is a danger or ‘enemy’, which should either be attacked or fled from. However, as Wilfred Bion puts it, the group is prepared to do either indifferently. Members look to the leader to devise some appropriate action; their task is merely to follow. […]
Basic assumption pairing (baP) BaP is based on the collective and unconscious belief that, whatever the actual problems and needs of the group, a future event will solve them. The group behaves as if pairing or coupling between two members within the group, or perhaps between the leader of the group and some external person, will bring about salvation. The group is focused entirely on the future, but as a defence against the difficulties of the present. […] The group is in fact not interested in working practically towards this future, but only in sustaining a vague sense of hope as a way out of its current difficulties. Typically, decisions are either not taken or left extremely vague. After the meeting, members are inevitably left with a sense of disappointment and failure, which is quickly superseded by a hope that the next meeting will be better.
Authority refers to the right to make an ultimate decision, and in an organization it refers to the right to make decisions which are binding on others.
There is an important difference between the terms authoritative and authoritarian. Authoritative is a depressive position state of mind (see Chapter 1) in which the persons managing authority are in touch both with the roots and sanctioning of their authority, and with their limitations. Authoritarian, by contrast, refers to a paranoid-schizoid state of mind, manifested by being cut off from roots of authority and processes of sanction, the whole being fuelled by an omnipotent inner world process. The difference is between being in touch with oneself and one’s surroundings, and being out of touch with both, attempting to deal with this unrecognized shortcoming by increased use of power to achieve one’s ends.
Drug addicts frequently live with an internal world full of chaos and uncertainty.
Drugs are often used to escape from the experience of this terrible turmoil. Reality becomes distorted, while they convince themselves that, for example, the drug has a beneficial effect on their lives, that it saves them from loneliness, despair and so on. The reality of the damage that the drug does along with the damaging life-style needed to maintain the addiction cannot be tolerated for long. Knowledge of the internal and external chaos is defended against by an assault on truth and reality, which in turn adds to the internal chaos. Those who work with drug addicts are also subjected to this assault on truth and reality; they have to make professional decisions, while living with uncertainty about what is really going on.
Some, uncharitably, dismiss Higgs’ singular success as luck. Without doubt fortune was involved here, as it is in many discoveries, but being in the right place at the right time is not enough; having the preparation to be able to act on serendipity is also important. Higgs’ story is a scientific analogue of the wisdom expounded by golfer Gary Player. After he holed a remarkable putt to win a major tournament, someone remarked, “Gary—that was lucky!” Player supposedly replied: “And the more I practice, the luckier I become!”
Most children who have no siblings learn social politics from their interaction with playmates and school friends, but Higgs’ early education in Birmingham had been as much from home as from school and, secluded from much wider merrymaking, he had to find ways of self-entertainment. His father’s bookshelves contained several texts on engineering from his student days at Bristol University. Thanks to this home library, Peter taught himself basic trigonometry, algebra, and calculus “before anyone at a school I went to taught it to me”. He attributes his dedication to mathematics as a direct result of his circumstances: “Physical health problems enabled me to forge ahead of my contemporaries, in maths especially.”
Higgs discovered too that his first physics teacher, Mr Willis, had thirty years earlier also taught Paul Dirac.
For theories of the fundamental forces, gauge invariance broadly means that the strength of the fields must be independent of the definitions of the potential. For gravity, when an object falls from a table, the speed at which it hits the floor is the same whether that table is on the ground floor or in a room at the top of a high-rise building. It is the change in the gravitational potential—the height from the tabletop to the floor—that matters, not their individual absolute elevation.
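The falling-object analogy can be made concrete with the standard energy-conservation formula; this is a textbook illustration added here, not part of the excerpt itself:

```latex
% Speed of an object after falling from rest through a height \Delta h,
% from conservation of energy: m g \Delta h = \tfrac{1}{2} m v^2, so
v = \sqrt{2 g \,\Delta h}
% Only the potential difference g\Delta h enters. Raising both the
% tabletop and the floor by the same absolute elevation H leaves v unchanged:
% g(h_{\text{top}} + H) - g(h_{\text{floor}} + H) = g\,\Delta h
```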
All atomic particles belong to one of two families: fermions or bosons. The names honour two scientists, Enrico Fermi and Satyendra Bose, who in the early days of quantum mechanics studied how particles behave when in large groups. Fermions are the basic seeds of matter, such as electrons or quarks, which in quantum mechanics are like cuckoos: two in the same nest are forbidden. Bosons are like penguins: large numbers cooperate as a colony. Bosons can accumulate into the lowest possible energy state—an effect known as Bose-Einstein condensation, after the two scientists whose work explains this phenomenon. This extremely low-energy state is manifested in weird phenomena, such as the superfluid ability of liquid helium to flow through narrow openings without friction; in superconductivity; and, if the six theorists were correct, Higgs bosons condense to produce a weird substance—today known as the Higgs field—that fills the universe.
Two millennia after Aristotle argued that the realization of “nothing” is untenable, the Higgs field is in effect a physical confirmation of that philosophy. According to Higgs’ theory, a truly empty vacuum devoid of all matter would be unstable. Add the Higgs field to this void, however, and it becomes stable. This may be counter-intuitive, but that is part of the theory’s magic.
Peter Higgs has managed to avoid much of the pace of modern life. In addition to having no television in his Edinburgh apartment, he does not use the internet and is not accessible by email—historically emails sent to him at Edinburgh University would be administered by departmental assistants. He has no public mobile phone contact. Other than personal visits, Higgs has been accessible only by me first leaving messages on a landline answerphone to agree on times for a conversation, or by sending letters through the post.
If the quantum equations are set up independently in these locations in different gauges, the dynamics of my electron and that in another continent or on the moon must be consistently accounted for; the results must be independent of the local choice of gauge. In 1947, in his pioneering work on re-normalizing QED, American theorist Julian Schwinger proved that for this gauge invariance to occur there must be some connection linking the various electrons and allowing us to compare the situation at the different locations. In quantum field theory, this connection consists of particles. The maths implied that the connection cares about direction—it is a vector—and the associated particle acts like a boson because of bosons’ ability to act cooperatively, in this case by building up the vector field connecting the electrically charged particles. So was born the concept of the gauge boson, which in the case of electromagnetism is the familiar photon.
The connection must be able to act over very large distances, and in quantum field theory this equates to the gauge boson having no mass. In summary, Schwinger had proved that gauge invariance implies that an electromagnetic force necessarily occurs between electrically charged bodies, and that this force is carried by a photon of zero mass. That a photon has no mass and travels through the void at nature’s speed limit is fundamental to Einstein’s special relativity theory. However, according to QED the vacuum is not empty because the photon is immersed in a sea of virtual electrons and positrons, which ensnare it, interrupting its flight. As QED implies that an electron at rest gains an infinite energy—or mass—because of these interactions, how does a photon manage to avoid a similar fate?
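The link between the range of a force and the mass of its carrier can be sketched with the standard Yukawa form (a textbook relation, not spelled out in the passage):

```latex
% Yukawa potential for a force carried by a boson of mass m:
V(r) \;\propto\; \frac{e^{-r/R}}{r}, \qquad R = \frac{\hbar}{m c}
% As m \to 0 the range R \to \infty and the Coulomb-like 1/r form is
% recovered; a force of unlimited range therefore requires a massless
% gauge boson, which is why the photon must have zero mass.
```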
By carefully examining the formulae in QED theory, Schwinger concluded that gauge invariance in QED underpins this phenomenon of the massless photon. This link between gauge invariance, the existence of a force, the vanishing mass of a photon, and the ability to make QED viable thanks to re-normalization was a profound result, which in the course of time would have far-reaching implications.
But Peter Higgs found the school physics syllabus “very boring” and, appreciating the irony, admitted, “I never won a prize for physics at school”.
“Yak shaving.” Our very own Carlin Vieri invented the term, and yet it has not caught on within the lab. This is a shame, because it describes all too well what I find myself doing all too often.
You see, yak shaving is what you are doing when you’re doing some stupid, fiddly little task that bears no obvious relationship to what you’re supposed to be working on, but yet a chain of twelve causal relations links what you’re doing to the original meta-task. Here’s an example:
“I was working on my thesis and realized I needed a reference. I’d seen a post on comp.arch recently that cited a paper, so I fired up gnus. While I was searching for the post, I came across another post whose MIME encoding screwed up my ancient version of gnus, so I stopped and downloaded the latest version of gnus.
“Unfortunately, the new version of gnus didn’t work with emacs 18, so I downloaded and built emacs 20. Of course, then I had to install updated versions of a half-dozen other packages to keep other users from hurting me. When I finally tried to use the new gnus, it kept crapping out on my old configuration. And that’s why I’m deep in the gnus info pages and my .emacs file — and yet it’s all part of working on my thesis.”
And that, my friends, is yak shaving.
While the wealthy have more control over their lives and live far more easily, an existence that emphasizes and rewards greed corrupts their humanity, leading to antisocial attitudes and behavior. A 2012 article in the Proceedings of the National Academy of Sciences is titled “Higher Social Class Predicts Increased Unethical Behavior”.
The authors found greed propels this behavior and that “relative to lower-class individuals, individuals from upper-class backgrounds behaved more unethically in both naturalistic and laboratory settings.” One of the authors, Dacher Keltner, commented: “As you move up the class ladder, you are more likely to violate the rules of the road, to lie, to cheat, to take candy from kids, to shoplift, and to be tight-fisted in giving to others.”
I’m ready for the tools to expand to fit my ideas. I refuse to keep cutting down my ideas to fit the tools.
Any sufficiently advanced magic is indistinguishable from science.
The advantage of more reliable and greater food supplies came with disadvantages. Skeletal remains indicate that people in agricultural societies were not as well nourished as hunter-gatherers, who ate a much wider variety of foods; their height and life expectancy declined and they had to work significantly more hours per week in the fields than they ever had collecting roots, seeds, berries, and nuts. In addition to the declines in height and general nutrition and the negative impact on women with the rise of social hierarchies, there was another drawback to those societies that developed to rely exclusively on farming.
Agricultural productivity (output per farmer and per area of land) did not increase much because the technologies used remained mostly unchanged. Therefore, as populations of non-farming classes grew relative to the number of farmers, and as elites desired greater quantities of wealth, there was a need to expand the area under production or to conquer other peoples in order to extract greater quantities of surplus (tribute). This more socially privileged stratum of people developed into a social class when it began to identify its own interests (in acquiring more wealth and status) as if they were the interests of the entire community. At that point, the interests of this proto-ruling elite diverged from those of everyone else, as they attempted to force increases in productivity by those laboring directly on the land, an increase in land area (through conquest), and a larger and docile labor force (slavery, kidnapping).
As shared by Reza Baradaran, talking about the fact that we share about 99% identical DNA with chimpanzees: Everything that we are that distinguishes us from chimps emerges from that roughly 1% difference in DNA. It has to, because that’s the difference. The Hubble telescope, these grand… that’s in that 1%.
Maybe everything that we are that is not the chimp is not as smart, compared to the chimp, as we tell ourselves it is. Maybe the difference between constructing and launching a Hubble telescope and a chimp combining two finger motions as sign language, maybe that difference is not all that great. We tell ourselves it is, just the same way we label our books “optical illusions”; we tell ourselves it’s a lot. Maybe it’s almost nothing.
The authors of an article in Psychological Inquiry made this observation:
Substantial evidence suggests that when the values and goals necessary for the smooth functioning of American corporate capitalism (ACC) become increasingly central to individuals and to institutions, the result is a corresponding conflict with three other aims: concern for the broader community and the world; close, intimate relationships; and feeling worthy and autonomous.
Web pages are ghosts: they’re like images projected onto a wall. They aren’t durable. If you turn off the projector (i.e. web server), the picture disappears. If you know how to run a projector, and you can keep it running all the time, you can have a web site.
Richard Faragher said […] just because centenarians had certain habits, it did not mean those habits were driving their longevity – an error in logic known as “survivorship bias”. […] Faragher said both theories, however, resulted in the same warning: “Never, ever take health and lifestyle tips from a centenarian.”
Richard Faragher added that many of the mooted possibilities for why centenarians live longer could actually be examples of reverse causation. For example, the idea that having a positive mental outlook can help you live for a very long time might, at least in part, be rooted in people being more sanguine because they have better health.
“When was the last time you had a really positive mental attitude and toothache?” he said.
The Tab-Hoarder is one who is reluctant to close internet tabs, usually resulting in a buildup of chaotic tab clutter in the browser’s tab bar.
Giving remarks on the ongoing cultural revolution: It seems as though we were the gods of a world that no longer exists.
The chameleon effect refers to nonconscious mimicry of the postures, mannerisms, facial expressions, and other behaviors of one’s interaction partners, such that one’s behavior passively and unintentionally changes to match that of others in one’s current social environment.
When talking about Stockholm Syndrome: We simply mean that when two or more people get together, they form a relationship – that’s all it is.
Of course, the more stress in the situation, the quicker the relationship, and the more intense it’s going to be. When people are in crisis, and they’re not sure about what’s going to happen, the one thing we’re all afraid of is going insane. I mean, we’re always worried about, are we losing our mind? Is this really happening to me? What am I doing in a thing like that? Am I experiencing this? And what we do is we want to test our feelings against another person, because if that person is sharing this experience and he’s seeing the same thing, and he ain’t going bananas and this is really happening, maybe it’s OK.
Money is the most successful story ever told, much more successful than any religious mythology. Not everybody believes in God, or in the same God, but almost everybody believes in money, even though it’s just a figment of our imagination.
People hear of fascism as this monster, and then when you hear the actual fascist story, what fascists tell you is always very beautiful and attractive. Fascists are people who come and tell you, “You are wonderful. You belong to the most wonderful group of people in the world. You’re beautiful. You are ethical. Everything you do is good. You have never done anything wrong. There are all these evil monsters out there that are out to get you, and they’re causing all the problems in the world.”
When people hear that, it’s like looking in the mirror and seeing something very beautiful. […] When you look in the fascist mirror, you never see a monster. You see the most beautiful thing in the world, and that’s the danger, […] that you see something is very beautiful, you don’t understand the monster underneath.
Instead of using the stories for our purposes, we allow the stories to use us for their purposes. Then you start entire wars because of a story. You inflict suffering on millions of people just for the sake of a story. That’s the tragedy of human history.
If you go to people and you tell them a complicated and painful story, many of them don’t want to listen. The advantage of fiction is that it can be made as simple and as painless or attractive as you want it to be because it’s fiction, and then what you see is that politicians like Hitler, they create a very simple story. We are the heroes. We always do good things. Everybody is against us. Everybody is trying to trample us, and this is very attractive.
A mondegreen (/ˈmɒndɪˌɡriːn/) is a mishearing or misinterpretation of a phrase in a way that gives it a new meaning. Mondegreens are most often created by a person listening to a poem or a song; the listener, being unable to hear a lyric clearly, substitutes words that sound similar and make some kind of sense. The American writer Sylvia Wright coined the term in 1954, recalling a childhood memory of her mother reading the Scottish ballad “The Bonnie Earl o’ Moray”, and mishearing the words “laid him on the green” as “Lady Mondegreen”.
Anonymity is when people can see what you do (or what you want them to see), but they can’t know who is actually behind those actions. It is the mirror image of privacy: with privacy, they know who you are, but they don’t know what you’re doing.
Think of privacy as insurance. You might trust your government today, you might trust your messenger with your messages, you might trust Google with your whole life, but remember, Satan was once an angel too.
Something to keep in mind when using the Tor Browser is that you shouldn’t modify anything. Leave it as is; don’t install plugins or tweak it. The Tor Browser is meant to have similar fingerprints to other Tor Browsers, and even the smallest changes might make you the most unique person on the Tor network.
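The fingerprinting point can be illustrated with a toy calculation (all data here is invented for illustration, not a real Tor measurement): your safety comes from blending into a large "anonymity set" of identical-looking browsers, and a single tweaked attribute can shrink that set to one.

```python
from collections import Counter

# Hypothetical population of browser fingerprints, each a tuple of
# (user_agent, screen_size, fonts). Values are made up for illustration.
population = (
    [("TorBrowser/13", "1000x1000", "default")] * 9999   # stock Tor Browsers
    + [("TorBrowser/13", "1000x1000", "custom-fonts")]   # one tweaked browser
)

def anonymity_set_size(fingerprint, population):
    """Number of users sharing this exact fingerprint (bigger = safer)."""
    counts = Counter(population)
    return counts[fingerprint]

# The stock browser hides among 9999 identical peers...
print(anonymity_set_size(("TorBrowser/13", "1000x1000", "default"), population))
# ...while the tweaked one is uniquely identifiable: its set has size 1.
print(anonymity_set_size(("TorBrowser/13", "1000x1000", "custom-fonts"), population))
```

The numbers are arbitrary, but the asymmetry is the point: installing a plugin or changing fonts does not just add information, it can collapse your anonymity set from thousands to one.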
“Doxing” is a neologism. It originates from a spelling alteration of the abbreviation “docs”, for “documents”, and refers to “compiling and releasing a dossier of personal information on someone”. Essentially, doxing is revealing and publicizing the records of an individual, which were previously private or difficult to obtain.
The term dox derives from the slang “dropping dox”, which, according to a contributor to Wired, Mat Honan, was “an old-school revenge tactic that emerged from hacker culture in the 1990s”.
Consider GNU Objective C. NeXT initially wanted to make this front end proprietary; they proposed to release it as .o files, and let users link them with the rest of GCC, thinking this might be a way around the GPL’s requirements. But our lawyer said that this would not evade the requirements, that it was not allowed. And so they made the Objective C front end free software.
Paraphrasing Calvin Coolidge: America is not a country, America is a business.
Phrenology, a pseudoscientific practice: the study of the conformation of the skull as indicative of mental faculties and traits of character, especially according to the hypotheses of Franz Joseph Gall (1758–1828), a German doctor, and such 19th-century adherents as Johann Kaspar Spurzheim (1776–1832) and George Combe (1788–1858). Phrenology enjoyed great popular appeal well into the 20th century but has been wholly discredited by scientific research.
Writing to “The Piano Guys” But why do I care? Why did the impulse to write this open letter not leave, even after five months? I’m really angry because, to quote the song, you give love a bad name. There’s been plenty of great talk about the power of love to crush hate, and I believe in that power wholeheartedly, but when fellow activists distrust it, that distrust always seems to originate in the false notion that love is synonymous with pleasantness. The two have nothing to do with each other. Love is a burning force within that drives you; depending on the individual and the circumstances, it can drive you to laugh or cry, embrace or bare your teeth. It can take almost any form, but the one thing it never does is stand by passively when the beloved is harmed or threatened. It might take subtle action, it might bide its time for the opportune moment, but it does not make nice to abusers in the interest of keeping things superficially friendly, then turn its back on the abused.
We shape our buildings; thereafter they shape us.
The price of great love is great misery when one of you dies.
One of Elliott Jaques’ most significant contributions resulting from this project study of a manufacturing company was the recognition that social systems in the workplace function to defend workers against unconscious anxieties inherent in the work. To the extent that such defenses are unconscious, the social systems are likely to be rigid and therefore uncomfortable; but because of their role in keeping anxiety at bay, they may also be very resistant to change.
“tightwads” […] people who have trouble spending their money. […] tightwads do not scrimp because they lack money. They are not any poorer than spendthrifts (people who overspend); […] Instead, they’re afraid to spend money that they do have.
[…] Damon Young […] calls this feeling “post-brokeness stress disorder.”
[…] “There are people who have … ‘money-hoarding’ tendencies,” she told me, “where they have tens of thousands of dollars sitting in a savings account”; they fear that “something’s going to happen, and it’s all going to be taken away from them.”
Or they worry that once they start spending, they won’t be able to stop—that “one drop of the bucket turns into the faucet running constantly,”
Anarchy means “without leaders”, not “without order”. With anarchy comes an age of ordnung, of true order, which is to say voluntary order… this age of ordnung will begin when the mad and incoherent cycle of verwirrung that these bulletins reveal has run its course… This is not anarchy, Eve. This is chaos.
Because while the truncheon may be used in lieu of conversation, words will always retain their power. Words offer the means to meaning, and for those who will listen, the enunciation of truth.
Introduce a little anarchy. Upset the established order, and everything becomes chaos.
The smallest natural point is larger than all mathematical points, and this is proved because the natural point has continuity, and any thing that is continuous is infinitely divisible; but the mathematical point is indivisible because it has no size.
The painter who draws merely by practice and by eye, without any reason, is like a mirror which copies every thing placed in front of it without being conscious of their existence.
Continuing on his point that technological change is ecological change… the consequences of technological change are always vast, often unpredictable and largely irreversible. That is also why we must be suspicious of capitalists. Capitalists are by definition not only personal risk takers but, more to the point, cultural risk takers. The most creative and daring of them hope to exploit new technologies to the fullest, and do not much care what traditions are overthrown in the process or whether or not a culture is prepared to function without such traditions.
Technological change is not additive; it is ecological. […] What happens if we place a drop of red dye into a beaker of clear water? Do we have clear water plus a spot of red dye? Obviously not. We have a new coloration to every molecule of water. That is what I mean by ecological change. A new medium does not add something; it changes everything. […] after the printing press was invented, you did not have old Europe plus the printing press. You had a different Europe.
[…] media tend to become mythic. I use this word in the sense in which it was used by the French literary critic, Roland Barthes. He used the word “myth” to refer to a common tendency to think of our technological creations as if they were God-given, as if they were a part of the natural order of things. […] Cars, planes, TV, movies, newspapers—they have achieved mythic status because they are perceived as gifts of nature, not as artifacts produced in a specific political and historical context.
When a technology becomes mythic, it is always dangerous because it is then accepted as it is, and is therefore not easily susceptible to modification or control. […] Whenever I think about the capacity of technology to become mythic, I call to mind the remark made by Pope John Paul II. He said, “Science can purify religion from error and superstition. Religion can purify science from idolatry and false absolutes.”
[…] every technology has a prejudice. Like language itself, it predisposes us to favor and value certain perspectives and accomplishments. In a culture without writing, human memory is of the greatest importance, as are the proverbs, sayings and songs which contain the accumulated oral wisdom of centuries. That is why Solomon was thought to be the wisest of men. In Kings I we are told he knew 3,000 proverbs. But in a culture with writing, such feats of memory are considered a waste of time, and proverbs are merely irrelevant fancies. The writing person favors logical organization and systematic analysis, not proverbs. The telegraphic person values speed, not introspection. The television person values immediacy, not history. And computer people, what shall we say of them? Perhaps we can say that the computer person values information, not knowledge, certainly not wisdom. Indeed, in the computer age, the concept of wisdom may vanish altogether.
The third idea, then, is that every technology has a philosophy which is given expression in how the technology makes people use their minds, in what it makes us do with our bodies, in how it codifies the world, in which of our senses it amplifies, in which of our emotional and intellectual tendencies it disregards. This idea is the sum and substance of what the great Catholic prophet, Marshall McLuhan meant when he coined the famous sentence, “The medium is the message.”
[…] we have been willing to shape our lives to fit the requirements of technology, not the requirements of culture. This is a form of stupidity, especially in an age of vast technological change. We need to proceed with our eyes wide open so that we may use technology rather than be used by it.
Programming languages teach you not to want what they cannot provide. You have to think in a language to write programs in it, and it’s hard to want something you can’t describe.
If languages were just about writing programs, we could’ve stopped with C. (And some have.) But computers and languages are interesting specifically because they’re malleable. (That is changing.) The more we expect out of programs, the more vital it is to explore new ways of making programs.
“term inflation” — the stretching of a good idea to promote values and goals quite different from what is meant.
Virtual networks are not only insufficient replacements for communities, but their proliferation makes the establishment of communities more difficult.
Unsupervised, child-directed play was in decline long before kids had smartphones … and more affluent children had many of their activities organized for them by their parents, putting them in a variety of highly structured functional groups with different kids rather than repeatedly playing freely with their neighbors. This oversupervision or “coddling” (Greg Lukianoff and Jonathan Haidt, 2018) made the attractions of smartphones and social media even more appealing.
A community requires a commitment to a certain social order—and usually to a place—that, by definition, must constrain some choices. In return for security, support, and belonging, members surrender some of their freedom. This explains why creating community in America today is so difficult—few want to compromise their ability to make choices.
Anthropomorphism is the attribution of human traits, emotions, or intentions to non-human entities. […] Personification is the related attribution of human form and characteristics to abstract concepts such as nations, emotions, and natural forces, such as seasons and weather.
[…] Anthropomorphism and anthropomorphization derive from the verb form anthropomorphize, itself derived from the Greek ánthrōpos (ἄνθρωπος, lit. “human”) and morphē (μορφή, “form”). It is first attested in 1753, originally in reference to the heresy of applying a human form to the Christian God.
[…] in other models, small amounts of synthetic data can poison an image model, and once distorted, it is difficult for such models to recover even after being trained on true data (Bohacek and Farid 2023). As one particular example, training LLMs on synthetic data can lead to diminishing lexical, semantic and syntactic diversity (Guo et al. 2023).
Researchers have noted that the recursive training of AI models on synthetic text may lead to degeneration, known as “model collapse”.
[…] Over time, dependence on these systems, and the existence of multifaceted interactions among them, may create a “curse of recursion” (Shumailov et al. 2023), in which our access to the original diversity of human knowledge is increasingly mediated by a partial and increasingly narrow subset of views. With increasing integration of LLM-based systems, certain popular sources or beliefs which were common in the training data may come to be reinforced in the public mindset (and within the training data), while other “long-tail” ideas are neglected and eventually forgotten.
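The mechanism behind this loss of diversity can be illustrated with a toy resampling sketch (this is my own illustration, not the Shumailov et al. experiment): if each “generation” of a model is trained only on samples drawn from the previous generation’s output, it can never reintroduce an idea the previous generation failed to emit, so the number of distinct ideas can only shrink.

```python
import random

def train_generation(corpus, sample_size, rng):
    # "Train" the next generation by sampling only from the previous one's output.
    return [rng.choice(corpus) for _ in range(sample_size)]

def diversity(corpus):
    # Count the distinct "ideas" still present in the corpus.
    return len(set(corpus))

rng = random.Random(0)
corpus = list(range(1000))        # 1000 distinct ideas in the original human data
history = [diversity(corpus)]
for _ in range(20):               # 20 generations of recursive training
    corpus = train_generation(corpus, 1000, rng)
    history.append(diversity(corpus))
# Each generation can only reproduce what the previous one emitted, so
# diversity is monotonically non-increasing: long-tail ideas vanish first.
```

Even with the corpus size held constant, the distinct-idea count collapses rapidly, because rare items are the most likely to be missed by any one round of sampling.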
When women audition for symphony orchestras, they are more likely to be accepted if the selection committee is unaware of their gender. Women software developers are considered better coders than their male counterparts if “their peers didn’t realize the code had been written by a woman.” And students judge a professor teaching an online course more favorably if they think that she is a he.
What we should really be worried about is the system of capitalism, not robots or our humanity. What has been shown to give people true happiness—control over our own lives and creative and socially meaningful work, participation in community activities, and time with friends and family—has decreased through longer hours at work, less control over the workplace environment, always-on connectivity, more job insecurity, longer commutes to and from work, and an unachievable quest for satisfaction through ownership of consumer goods.
In the 1960s, the U.S. government stole hundreds of thousands of acres from the Standing Rock Sioux Reservation in order to build dams in the Missouri River watershed. In the process, “hundreds of Indian families from various tribes were forcibly relocated and their way of life completely destroyed.”
Discussing countering Soviet influence in the Middle East:
President Eisenhower said he thought we should do everything possible to stress the ‘holy war’ aspect.
Under capitalism, efforts to ameliorate or reverse the damage caused by ecological rifts and disturbances all have a common element: the underlying cause of the problem cannot be questioned.
To Wall Street, which sorts risk by its ability to maximize profit, “adaptation” to climate change simply means learning how to profit from it. And there are a variety of ways of doing so—for example, rebuilding after flooding offers a boost for the construction industry as does the building of coastal defenses against rising seas, already occurring along the East Coast of the United States. Miami is spending $400 million for pumps and other infrastructure to prevent flooding. Hoboken, New Jersey, received a $230 million federal grant to shore up protections, and Norfolk, Virginia, received $100 million from the federal government to carry out a plan to protect neighborhoods from flooding. All of this spending adds to the measured national economic growth (GDP) and thus is seen as positive—even when the reason is negative.
[…] One of the growing areas of financial speculation is the issuance and trading of so-called catastrophe bonds, referred to as cat bonds, offering insurance against catastrophic weather events as well as other types of disasters such as wildfires. The $72 billion in cat bonds in 2016 is expected to double in the next few years.
Inherent in the blame-the-victim ideology is the myth of equal opportunity for everyone. If you work hard enough and make the right choices, you will succeed. But a child conceived and born into poverty, with its associated stresses and limitations, does not have the same opportunities as a child born into wealth. The continuing belief in the falsehood of equal opportunity and the equally false notion of easy upward mobility, as well as the widespread acceptance of racist and sexist ideas, help to explain many people’s acquiescence to gross inequalities in society and continuing discrimination. This ideology sanctifies wealth and greed as reward for good behavior and good decisions and helps to explain, as W. E. B. Du Bois wrote, “the fact that so many civilized persons are willing to live in comfort even if the price of this is poverty, ignorance, and disease of the majority of their fellowmen.”
In a study across sixty-three countries, researchers estimated that approximately 20 percent of suicides were the result of unemployment.
The source of our alienation is the lack of control in the workplace, and the unequal opportunities it affords for financial well-being as well as quality of life. Consumerism is a form of emotional compensation we use in a futile attempt to overcome alienation. Recognizing this can help us begin to constructively deal with our alienation.
Advanced neurological and genetic research … has shown that animals like chimpanzees, orcas and elephants possess self-awareness, self-determination and a sense of both the past and future. They have their own distinct languages, complex social interactions and tool use. They grieve and empathize and pass knowledge from one generation to the next. The very same attributes, in other words, that we once believed distinguished us from other animals.
Relying on unpaid work done primarily by women, instead of on services provided by social programs, means lower taxes on the wealthy and more profits for capitalists.
In our society, however, recycling serves an important ideological function by convincing people that they’re doing something positive for the environment while obscuring the question of why so many products are purposely designed for single use or ready disposal. This perspective puts the responsibility of waste on the individual, not on the company or the system as a whole.
Heede’s research shows that nearly two-thirds of anthropogenic carbon emissions originated in just 90 companies and government-run industries. Among them, the top eight companies—ranked according to annual and cumulative emissions—account for 20 percent of world carbon emissions from fossil fuels and cement production since the Industrial Revolution. So why not just stop these companies from operating and promote the building of a clean energy system? There is no evidence that governments will discipline giant corporations and shift the global economy away from fossil fuels. To do so would require drastic downsizing or liquidation of many of the largest corporations on the planet. It would leave Wall Street in tatters. It would also require a huge investment to build a replacement consisting of renewable energy infrastructure. While costs for solar (PV) and wind energy installations have dropped drastically, and in some cases are cost competitive with new fossil fuel installations, it is wishful thinking of the most utopian kind to expect that capitalists and governments will allow public financing for an effort to replace existing electric power facilities. It is only when the elite feel a direct and immediate threat to their system of capital accumulation, such as a major war or civil insurrection, that they are willing to commit the vast amount of financial resources necessary and agree to have production directed by government.
Neoliberalism has achieved an incredible stranglehold on our thinking in recent decades. Even people who genuinely care about the environment have started to believe that market-based solutions like pollution offsets and carbon trading offer a better solution than government regulation and enforcement.
Nature is a hard word to define. Originating in Latin as natura, meaning “birth,” today people give the word a wide variety of meanings. Colloquially, it’s used as a blanket term applied to everything that isn’t human or constructed by humans. In everyday speech, people refer to animals as if humans were in an altogether different category (aside from species), instead of being fellow animals. […] Nature in Western societies is commonly viewed as a place that we travel to visit rather than inhabit on a daily basis.
The range of choices available to buyers is determined not by what is environmentally friendly, but by what can be sold profitably. As a result, we get micro-choices such as Ford vs Hyundai—but not real choices such as automobiles vs reliable and affordable public transit.
Let us not […] flatter ourselves overmuch on account of our human victories over nature. For each such victory nature takes its revenge on us. Each victory, it is true, in the first place brings about the results we expected, but in the second and third places it has quite different, unforeseen effects which only too often cancel out the first […]. At every step we are reminded that we by no means rule over nature like a conqueror over a foreign people, like someone standing outside nature—but that we, with flesh, blood and brain, belong to nature, and exist in its midst, and that all our mastery of it consists in the fact that we have the advantage over all other creatures of being able to learn its laws and apply them correctly.
By making individual persons the solution to waste, we become the very thing capitalists want: consumers. Conscious and concerned or otherwise, it doesn’t matter. Efforts to change consumption habits instead of production will not solve the problem. As Samantha McBride writes: What we have for producers is freedom: freedom to be green or semi-green or not green, freedom to do what is in their best interests, without strife or inconvenience. What we have for citizens are (1) a definition of their scope of political action as not just personal behavior, but the purchase transaction, and (2) an utter lack of knowledge needed to reenter the realm of the political and advocate for regulatory change.112 Stressing individual responsibility of ordinary people leads us to ignore the waste associated with production. It also ignores the waste associated with consumption by the very wealthy, the military-industrial complex, the vast incarceration system, and the advertising industry. It leads us to disregard that companies will always be striving to sell more products year after year.
If we humans are a part of nature, then to even talk of a “pristine” environment at all is misrepresentative and ideological.
Gender violence is more of a threat to women’s health than the sum of all traffic accidents and malaria.
It is that exploited, alienated and relatively powerless period, the working day, which reduces … people to settle for commodity satisfaction in their “free time.” The bargain […] extends its influence throughout all levels and institutions, marking out the shape of the “consumer society.” It is this society which threatens the environment with its unlimited appetite—unlimited precisely because its objects are so unsatisfying.
In Moscow, there is a population of feral dogs who know how to use the subway.
The quintessentially human-created environment of the city, with acres of asphalt, canyons of steel, brick, and concrete, thousands of car engines, the screaming sirens and honking horns, and millions of humans, is also natural, constructed by a natural species. Cities are also places where wildlife exists, and are ecosystems in their own right.
Capitalists cannot take into account the consequences of their actions (the “externalities”) in their pursuit of profits. As long as there is no interference with the accumulation of capital, emitted pollutants (and how they behave according to scientific laws) are viewed by capital as irrelevant to the operation of companies. Actually, they are not considered at all unless strong government regulations exist and are enforced.
Capitalism: A system that does not meet the basic needs of the mass of humanity, a system that continues to harm so many people, a system in which countries use force to promote their perceived geopolitical and economic interests, unable to stop itself from destroying the biosphere it depends on, is a system that desperately needs to be superseded by a different system: one that has completely different goals, logic, and ways of operating, a society based on substantive human equality and that regenerates and then maintains a healthy ecosystem.
Our genus (Homo) had many branches, including us (Homo sapiens) and a number of species now extinct, such as Homo neanderthalensis, Homo erectus, Homo habilis, Homo floresiensis, and the latest member, Homo naledi, whose remains were found in a cave in South Africa in 2013 and first described in 2015.
As soon as you monetize something in nature, nature always loses.
Alienation from our own labor is not only or even primarily a psychological condition, but a feature integral to capitalist social relations, one that profoundly damages us. The repetitive, often pointless monotony of most work, the jobs in which people spend a large part of their waking hours, “is a […] stifling of one’s urge to self-fulfillment in the most important segment of one’s life, the spending of vital energies at tasks one would never dream of freely choosing.”
The story behind Lisp is fun (you can read John McCarthy’s account in the first History of Programming Languages). One of the motivations was that he wanted something like “Mathematical Physics” — he called it a “Mathematical Theory of Computation”. Another was that he needed a very general kind of language to make a user interface AI — called “The Advice Taker” — that he had thought up in the late 50s.
He could program — most programs were then in machine code, Fortran existed, and there was a language that had linked lists.
John made something that could do what any programming language could do (relatively easy), but did it in such a way so that it could express the essence of what it was about (this was the math part or the meta part or the modern Maxwell’s Equations part, however you might like to think of it). He partly did this — he says — to show that this way to do things was “neater than a Turing Machine”.
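The “Maxwell’s Equations” quality of McCarthy’s idea — a language whose evaluator can be written briefly in terms of the language’s own data structures — can be sketched in a few lines. This is a minimal toy in the spirit of McCarthy’s eval, not his actual 1960 definition; the expression encoding as nested Python lists is my own choice:

```python
def evaluate(expr, env):
    """Evaluate a tiny Lisp-like expression encoded as nested Python lists."""
    if isinstance(expr, str):          # a bare string is a variable reference
        return env[expr]
    if not isinstance(expr, list):     # anything else non-list is a literal
        return expr
    op, *args = expr
    if op == "quote":                  # return the expression itself, unevaluated
        return args[0]
    if op == "if":
        cond, then, alt = args
        return evaluate(then if evaluate(cond, env) else alt, env)
    if op == "lambda":                 # ["lambda", ["x"], body]
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)             # otherwise: function application
    return fn(*[evaluate(a, env) for a in args])

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
square = evaluate(["lambda", ["x"], ["*", "x", "x"]], env)
```

With `quote`, `if`, `lambda`, and application, the interpreter already expresses the essence of what it is about — which is the “neater than a Turing Machine” point.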
Scientists after Newton were qualitatively more able than before, etc. My slogan for this is “Point of view is worth 80 IQ points”.
[…] One of our many problems with thinking is “cognitive load”: the number of things we can pay attention to at once. The cliche is 7±2, but for many things it is even less. We make progress by making those few things be more powerful.
This is one of the reasons mathematicians like compact notation. The downside is the extra layers of abstraction and new cryptic things to learn—this is the practice part of violin playing—but once you can do this, what you can think about at once has been vastly magnified.
[…] contrary to the myth that the original colonial invaders of the Americas found only small wandering bands of Neolithic hunter-gatherers and mysteriously collapsed empires, the Western Hemisphere was densely populated. An estimated 100 million people lived in the Americas at the time of the European invasion, existing in a range of sophisticated and highly developed societies.
[…] undecided if i’m for or against it, in research the line between love and hate is thin.
Telling my father: in your life, earn enough that you don’t have to think about money.
Although he attributes this quote to Vladimir Horowitz, the quote has diverged so much that it is now his own.
Playing fast is in playing slow.
[…] programming languages are not just technology, but what programmers think in. They’re half technology and half religion.
One must practice slowly, then more slowly, and finally slowly.
In computer programming, homoiconicity (from the Greek words homo- meaning “the same” and icon meaning “representation”) is a property of some programming languages. A language is homoiconic if a program written in it can be manipulated as data using the language. The program’s internal representation can thus be inferred just by reading the program itself. This property is often summarized by saying that the language treats code as data.
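The “code as data” property can be imitated in Python, which is not itself homoiconic, by representing a program as nested lists — the same structure a Lisp program literally is. The point of the sketch (my own illustration, with a hypothetical `swap_ops` transform) is that an ordinary list operation then counts as a program transformation:

```python
# A "program" written as plain nested lists: (+ 1 (* 2 3))
program = ["+", 1, ["*", 2, 3]]

def swap_ops(expr):
    """Transform the program as data: rewrite every * into +."""
    if isinstance(expr, list):
        head, *rest = expr
        head = "+" if head == "*" else head
        return [head] + [swap_ops(e) for e in rest]
    return expr

OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def run(expr):
    """A minimal evaluator for the list-encoded arithmetic language."""
    if not isinstance(expr, list):
        return expr
    op, *args = expr
    return OPS[op](*[run(a) for a in args])
```

Here `run(program)` yields 7, while `run(swap_ops(program))` yields 6 — the program was rewritten with the same list tools used for any other data, which is exactly what a homoiconic language gives you natively.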
The object-oriented nature of Smalltalk was very suggestive. For example, object-oriented means that the object knows what it can do. In the abstract symbolic arena, it means we should first write the object’s name (or whatever will fetch it) and then follow with a message it can understand that asks it to do something. In the concrete user-interface arena, it suggests that we should select the object first. It can then furnish us with a menu of what it is willing to do. In both cases we have the object first and the desire second. This unifies the concrete with the abstract in a highly satisfying way.
The contrastive ideas of Bruner suggested that there should always be a way to compare. The flitting-about nature of the iconic mentality suggested that having as many resources showing on the screen as possible would be a good way to encourage creativity and problem solving and prevent blockage. An intuitive way to use the windows was to activate the window that the mouse was in and bring it to the “top.” This interaction was modeless in a special sense of the word. The active window constituted a mode to be sure—one window might hold a painting kit, another might hold text—but one could get to the next window to do something in without any special termination. This is what modeless came to mean for me—the user could always get to the next thing desired without any backing out. The contrast of the nice modeless interactions of windows with the clumsy command syntax of most previous systems directly suggested that everything should be made modeless. Thus began a campaign to “get rid of modes.”
My main complaint is that metaphor is a poor metaphor for what needs to be done. At PARC we coined the phrase user illusion to describe what we were about when designing user interfaces. There are clear connotations to the stage, theatrics, and magic—all of which give much stronger hints as to the direction to be followed. For example, the screen as “paper to be marked on” is a metaphor that suggests pencils, brushes, and typewriting. Fine, as far as it goes. But it is the magic—understandable magic—that really counts. Should we transfer the paper metaphor so perfectly that the screen is as hard as paper to erase and change? Clearly not. If it is to be like magical paper, then it is the magical part that is all important and that must be most strongly attended to in the user interface design.
While the magic is being designed, the very idea of a paper “metaphor” should be scrutinized mercilessly. One of the most wonderful properties of a computer is that no matter how many dimensions one’s information has, a computer representation can always supply at least one more. One result is that any seeming distance between items in our world of limited dimension can be completely “disappeared.”
This is something that Vannevar Bush and his chief prophet Doug Engelbart noticed immediately, and hypermedia was born. In a world of Dynabooks, information will not be printed—it would destroy most of the useful associations—and something much more than superpaper will emerge. The notion of hypermedia is much more a “user illusion” than a “metaphor.”
It is not surprising, either, that many people who are “figurative” have extreme difficulty getting anything finished—there is always something new and interesting that pops up to distract. Conversely, the “symbolic” person is good at getting things done, because of the long focus on single contexts, but has a hard time being creative, or even being a good problem solver because of the extreme tendency to get blocked.
One of my longstanding pet hates is to have them behave anything like their physical counterparts. For example, as they existed in Officetalk, Star, Lisa, and Mac—like real folders—there is only one icon for a document or application and it can be in only one folder. This drives me crazy, because the probability of not finding what you are looking for by browsing has just been maximized! It is trivial to have as many icon instances for a given doc or app in as many folders as one wishes. They should be near any place where they might be useful. (Dragging a singleton out on the desktop is not a solution to this problem!) But even if that were fixed we have to ask: why a folder? Instead of passive containers, why not have active retrievers that are constantly trying to capture icon instances that are relevant to them? Let’s call them bins. Imagine having a “memos” bin that, whenever and wherever you make up a memo, captures a copy of the doc icon. You might have a “memos to boss” bin that automatically captures only those icons of docs sent to your boss. Folders kill browsing. Bins extend their useful range.
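Kay’s bins idea — active retrievers instead of passive containers — is easy to sketch. The `Doc`, `Bin`, and `Desktop` names below are my own hypothetical stand-ins, not anything from an actual system; the point is that a document is matched by predicates, so the same doc can appear in every bin where it is relevant:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Doc:
    title: str
    kind: str
    recipient: str = ""

@dataclass
class Bin:
    """An active retriever: it captures a reference to every doc its predicate matches."""
    name: str
    predicate: Callable[[Doc], bool]
    items: List[Doc] = field(default_factory=list)

class Desktop:
    def __init__(self):
        self.bins = []

    def add_bin(self, b):
        self.bins.append(b)

    def create(self, doc):
        # Unlike a folder, every matching bin gets a reference to the same doc,
        # so the doc is "near any place where it might be useful."
        for b in self.bins:
            if b.predicate(doc):
                b.items.append(doc)
        return doc
```

A memo addressed to the boss would then land in both a “memos” bin and a “memos to boss” bin at once — the multiplicity that a one-folder-per-icon model forbids.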
I read McLuhan’s Understanding Media and understood that the most important thing about any communications medium is that message receipt is really message recovery; anyone who wishes to receive a message embedded in a medium must first have internalized the medium so it can be “subtracted” out to leave the message behind. When he said “the medium is the message” he meant that you have to become the medium if you use it.
In the water-pouring experiment, after the child asserted there was more water in the tall thin glass, Jerome Bruner covered it up with a card and asked again. This time the child said, “There must be the same because where would the water go?” When Bruner took away the card to again reveal the tall thin glass, the child immediately changed back to saying there was more water.
When the cardboard was again interposed the child changed yet again. It was as though one set of processes was doing the reasoning when the child could see the situation, and another set was employed when the child could not see. Bruner’s interpretation of experiments like these is one of the most important foundations for human-related design. Our mentalium seems to be made up of multiple separate mentalities with very different characteristics. They reason differently, have different skills, and often are in conflict.
Bruner identified a separate mentality with each of Piaget’s stages: He called them enactive, iconic, symbolic. While not ignoring the existence of other mentalities, he concentrated on these three to come up with what are still some of the strongest ideas for creating learning-rich environments.
The ability to “read” a medium means you can access materials and tools created by others. The ability to “write” in a medium means you can generate materials and tools for others. You must have both to be literate. In print writing, the tools you generate are rhetorical; they demonstrate and convince. In computer writing, the tools you generate are processes; they simulate and decide.
If we agree with the evidence that the human cognitive facilities are made up of a doing mentality, an image mentality, and a symbolic mentality, then any user interface we construct should at least cater to the mechanisms that seem to be there. But how? One approach is to realize that no single mentality offers a complete answer to the entire range of thinking and problem solving. User interface design should integrate them at least as well as Bruner did in his spiral curriculum ideas.
The work of Papert convinced me that whatever user interface design might be, it was solidly intertwined with learning. Bruner convinced me that learning takes place best environmentally and roughly in stage order—it is best to learn something kinesthetically, then iconically, and finally the intuitive knowledge will be in place that will allow the more powerful but less vivid symbolic processes to work at their strongest.
One of the implications of the Piaget-Bruner decomposition is that the mentalities originated at very different evolutionary times and there is little probability that they can intercommunicate and synergize in more than the most rudimentary fashion. In fact, the mentalities are more likely to interfere with each other as they compete for control. The study by Hadamard on math and science creativity and others on music and the arts indicate strongly that creativity in these areas is not at all linked to the symbolic mentality (as most theories of teaching suppose), but that the important work in creative areas is done in the initial two mentalities—most in the iconic (or figurative) and quite a bit in the enactive.
McLuhan’s claim that the printing press was the dominant force that transformed the hermeneutic Middle Ages into our scientific society should not be taken too lightly—especially because the main point is that the press didn’t do it just by making books more available, it did it by changing the thought patterns of those who learned to read.
Though much of what McLuhan wrote was obscure and arguable, the sum total to me was a shock that reverberates even now. The computer is a medium! I had always thought of it as a tool, perhaps a vehicle—a much weaker conception. What McLuhan was saying is that if the personal computer is a truly new medium then the very use of it would actually change the thought patterns of an entire civilization.
Originally in Persian: When my uncle Shahab died, my father told me, sitting on the stairs: “It’s alright, we are fine. I lost my brother when I was 15 and I survived; you’ll be fine too.”
I think of life as taking a train. We each have our own destinations; each of us gets on the train at one station and leaves at another. Life is like that: all we can do is enjoy the moments we have in the parts of our journeys that we share together.
Everyone seems to want user interface but they are not sure whether they should order it by the yard or by the ton.
Every single joke in this paper is pure gold, but just for the sake of remembering the golds here are some of them…
[…] 1842 - Ada Lovelace writes the first program. She is hampered in her efforts by the minor inconvenience that she doesn’t have any actual computers to run her code. Enterprise architects will later relearn her techniques in order to program in UML.
[…] 1959 - After losing a bet with L. Ron Hubbard, Grace Hopper and several other sadists invent the Capitalization Of Boilerplate Oriented Language (COBOL). Years later, in a misguided and sexist retaliation against Adm. Hopper’s COBOL work, Ruby conferences frequently feature misogynistic material.
[…] 1983 - In honor of Ada Lovelace’s ability to create programs that never ran, Jean Ichbiah and the US Department of Defense create the Ada programming language. In spite of the lack of evidence that any significant Ada program is ever completed historians believe Ada to be a successful public works project that keeps several thousand roving defense contractors out of gangs.
[…] 1986 - Brad Cox and Tom Love create Objective-C, announcing “this language has all the memory safety of C combined with all the blazing speed of Smalltalk.” Modern historians suspect the two were dyslexic.
[…] 1990 - A committee formed by Simon Peyton-Jones, Paul Hudak, Philip Wadler, Ashton Kutcher, and People for the Ethical Treatment of Animals creates Haskell, a pure, non-strict, functional language. Haskell gets some resistance due to the complexity of using monads to control side effects. Wadler tries to appease critics by explaining that “a monad is a monoid in the category of endofunctors, what’s the problem?”
[…] 1995 - At a neighborhood Italian restaurant Rasmus Lerdorf realizes that his plate of spaghetti is an excellent model for understanding the World Wide Web and that web applications should mimic their medium. On the back of his napkin he designs Programmable Hyperlinked Pasta (PHP). PHP documentation remains on that napkin to this day.
It’s like when you don’t know someone, they are more interesting; they can be anyone you like them to be.
In my experience most of the community in Elm is made up of people who don’t really know what type classes can give you, for example, but they’ll happily argue that it’s too advanced or not needed. Most of that comes from parroting the popular in-community opinion instead of informing themselves.
This kind of inbred opinion is not unique to Elm: you can find it in Elixir, Clojure and pretty much every other community that relies too much on the benevolent dictator or the prominent founder/inventor paradigm.
[…] I also find it interesting that a lot of these languages that rely on this paradigm have leaders that constantly complain that it’s hard to run this kind of community. The reason it’s so hard is because they’ve made themselves a benevolent dictator and they keep that status quo because presumably they like that they can sort of control opinion in the community that way as well.
That’s why we study history. So we’ll stop killing each other.
In designing SELF, we have been led to some rather strange recurring themes. We present them here for the reader to ponder.
Behaviorism: In many object languages, objects are passive; an object is what it is. In SELF, an object is what it does. Since variable access is the same as message passing, ordinary passive objects can be regarded merely as methods that always return themselves. For example, consider the number 17. In Smalltalk, the number 17 represents a particular (immutable) state. In SELF, the number 17 is just an object that returns itself and behaves a certain way with respect to arithmetic. The only way to know an object is by its actions.
Computation viewed as refinement: In Smalltalk, the number 17 is a number with some particular state, and the state information is used by the arithmetic primitives—addition, for example. In SELF, 17 can be viewed as a refinement of shared behavior for numbers that responds to addition by returning 17 more than its argument. Since in SELF, an activation record’s parent is set to the receiver of the message, method activation can be viewed as the creation of a short-lived refinement of the receiver. Likewise, block (closure) activation can be viewed as the creation of a refinement of the activation record for the enclosing context scope.
Parents viewed as shared parts: Finally, one can view the parents of an object as shared parts of the object. From this perspective, a SELF point contains a private part with x and y slots, a part shared with other points containing +, -, etc. slots, and a part shared with all other objects containing behavior common to all objects. Viewing parents as shared parts broadens the applicability of inheritance.
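The slot-and-delegation model described in these three passages can be sketched in a few lines of Python. This is a loose approximation, not real SELF: the `Obj` class, the `send` method, and the `point_traits` name are all invented for illustration, and real SELF unifies slots and messages far more deeply.

```python
class Obj:
    """A toy prototype object: every slot access is a message send."""
    def __init__(self, parent=None, **slots):
        self.parent = parent          # "parents viewed as shared parts"
        self.slots = dict(slots)

    def send(self, name, *args):
        obj = self
        while obj is not None:        # walk the parent chain
            if name in obj.slots:
                value = obj.slots[name]
                # A method slot runs with the receiver; a data slot
                # just "returns itself", like SELF's passive objects.
                return value(self, *args) if callable(value) else value
            obj = obj.parent          # delegate to the shared part
        raise AttributeError(name)

# Behavior shared by all points, playing the role of a SELF traits parent.
point_traits = Obj(add=lambda self, other: Obj(
    parent=self.parent,
    x=self.send("x") + other.send("x"),
    y=self.send("y") + other.send("y")))

p = Obj(parent=point_traits, x=1, y=2)   # private part: x and y slots
q = Obj(parent=point_traits, x=10, y=20)
r = p.send("add", q)
print(r.send("x"), r.send("y"))  # 11 22
```

Note how the sketch makes the paper's point concrete: `p` reaches `add` only by delegating to its shared parent, and reading `x` goes through exactly the same `send` path as calling a method, so "an object is what it does."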
Paul: The classical market is a lot like the wine market. It survives by creating the illusion of meaningful diversity. There will always be the demand for the opportunity to distinguish oneself by pretending to see a world where others see nothing.
“Sometimes there really is a world,” Helen says. “Well, the fact that you perceive a difference doesn’t necessarily make that difference meaningful.”
Henry Cole: There is a lot to be said for staying at the surface of things.
Helen Morrison: Some pianists are cerebral, others sentimental. Henry Cole is an existential pianist. He plays with his whole life.
Of course, being existential about anything can be complicated.
Helen Morrison: It’s been said that over the music of Beethoven is spread the twilight of eternal loss and eternal hope. The same goes for life, I suppose. Except for the eternal part.
German composers are good company.
It’s hard to explain how I felt as Henry Cole played that night. What does great music feel like? Like a form of knowledge, maybe, or even wisdom? But it isn’t wisdom of course, nor anything else I can put into words.
The best I can do is to say that it’s somehow about what it feels like to be alive. That music was filled with grief and longing and dogged resolve. And as I listened, I felt suddenly richer, more compassionate. And I wanted to share the moment with the whole world.
I guess the word I’m looking for is gratitude. Gratitude for Schumann, Bach, Beethoven. Gratitude for Henry Cole and all those who celebrate the music of life.
Helen Morrison: Nietzsche famously said that without music life would be a mistake.
German philosophers tend to exaggerate.
But he did have a point. I know that without music my own life would’ve been incomplete in some fundamental way. Like if I’d had no friends or no memories.
I even tried to be a pianist for a while until I realized just how fragile piano playing really is.
Meetings at Amazon and Blue Origin are unusual. When new people come in, like a new executive joins, they’re a little taken aback sometimes because the typical meeting, we’ll start with a six-page narratively structured memo and we do study hall. For 30 minutes, we sit there silently together in the meeting and read. […] Take notes in the margins. And then we discuss. And the reason, by the way, we do study hall is, you could say, I would like everybody to read these memos in advance, but the problem is people don’t have time to do that. And they end up coming to the meeting having only skimmed the memo or maybe not read it at all, and they’re trying to catch up. And they’re also bluffing like they were in college having pretended to do the reading.
[…] But one of the problems is PowerPoint is really designed to persuade. It’s kind of a sales tool. And internally, the last thing you want to do is sell. Again, you’re truth seeking. You’re trying to find truth. And the other problem with PowerPoint is it’s easy for the author and hard for the audience. And a memo is the opposite. It’s hard to write a six-page memo. A good six-page memo might take two weeks to write. You have to write it, you have to rewrite it, you have to edit it, you have to talk to people about it. They have to poke holes in it for you. You write it again, it might take two weeks. So the author, it’s really a very difficult job, but for the audience it’s much better.
“Can you explain this day one thing?” Fridman asks. “It’s really a very simple, and I think age-old, idea about renewal and rebirth: every day is day one. Every day you are deciding what you’re going to do, and you are not trapped by what you were or who you were or any self-consistency. Self-consistency can even be a trap. And so day one thinking is we start fresh every day […]”
Victor talking about Steve Jobs copying GUI from Smalltalk… The original concept of object-oriented programming was entirely graphical — objects on the screen represented objects in the program. That was the deep idea. But in the Macintosh, the GUI didn’t allow for programming the computer at all. Jobs made a shallow clone of Smalltalk’s deep ideas in order to get the first Macintosh onto the market.
What we consider unknown or mysterious continues to shrink, even if the scale of the cosmos means that the shrinkage takes the form of ∞ - x, where ∞ is the vast unknown and x is all of human knowledge.
[…] Victor wants to experience the revolutionary ∞. He seems to be on a spiritual quest, seeking an insight that Alan Kay calls “a kerpow.” An opening into a new dimension. Because deeper than any deep idea, somewhere beyond the land of ideas entirely, there is a rich and boundless terrain that has never been mapped, though we have tried for thousands of years.
Victor’s dream is to be able to experience an entire scientific paper — or the entire global supply chain — in a computationally-driven room. To explore the data with more richness and depth than would be possible on a single screen. And on the most existential level, his hope is that the research might help to avert human extinction. If we can understand the complexity of our world more broadly, we’ll be in a better long-term position to mitigate risks that may threaten civilization.
One element that encourages community: Dynamicland is the only place you can go to work on Realtalk projects. I can’t work on projects alone at home. There’s no GitHub for Realtalk. My time at Dynamicland feels precious, and that preciousness seems to elevate my creativity. Maybe this is what it felt like to have to schedule a block of time at a university computer before the PC era.
Bret Victor says he has tried to treat Dynamicland like a biological containment facility. He’s concerned that there’s been so much damage done by half-baked ideas stolen from research labs by entrepreneurs. “The thing about taking a deep idea and making a mass-produced, superficial treatment of it, is that after that point it becomes impossible to see the deep idea,” he says.
Einstein’s genius was his ability to use physical intuition and imagination to solve problems, but in his later years, he increasingly relied on abstract mathematics. This shift from physical intuition to pure mathematics may have hindered his progress in finding a unified field theory.
As quoted by Zea: People have a hard time letting go of their suffering. Out of a fear of the unknown, they prefer suffering that is familiar.
A lot of what we use today is extended from our analog past: email, digital books, and digital photos are more or less direct carryovers from physical letters, books, and photos. And this tendency has bled into hardware products: We’ve extended $0.05 pencils and $0.005 paper by creating $200 digital pencils and $1,000 digital tablets. By carrying forward some of the elegance of pencil and paper into the digital realm, we cheat ourselves out of discovering entirely new approaches.
Many people have suggested ASCII text files might not be the best way to present source code. On the other hand, plain text has a lot of benefits, not least of which is the huge ecosystem of tools built around consuming and producing it.
Translated from Persian: If you wish to make someone like you, ask them for small favours. People love to do things for each other, and it makes them like you a lot. This is something I just thought of!
I think rich people are worse at community because you get used to getting everything your own way […] In community living, you have to give up your preferences.
The embrace of living near friends isn’t just about real estate. It represents a reframing of what success looks like in America, where upward mobility has always flowed toward more privacy. The more money you have, the more you can afford to shield yourself from the messiness that arises from sharing space with other people — whether through living alone or hiring support like housekeepers or nannies. But as Kristen Berman put it, people may have begun to understand that they are often using their money to “buy more loneliness.”
With every person that you lose, a part of you dies.
Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and, to coin one at random, “memex” will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.
The three outstanding problems in physics, in a certain sense, were never worked on while I was at Bell Labs. By important I mean guaranteed a Nobel Prize and any sum of money you want to mention. We didn’t work on (1) time travel, (2) teleportation, and (3) antigravity. They are not important problems because we do not have an attack. It’s not the consequence that makes a problem important, it is that you have a reasonable attack. That is what makes a problem important.
Most of the time the audience wants a broad general talk and wants much more survey and background than the speaker is willing to give. As a result, many talks are ineffective. The speaker names a topic and suddenly plunges into the details he’s solved. Few people in the audience can follow.
You should paint a general picture to say why it’s important, and then slowly give a sketch of what was done. Then a larger number of people will say, “Yes, Joe has done that,” or “Mary has done that; I really see where it is; yes, Mary really gave a good talk; I understand what Mary has done.” The tendency is to give a highly restricted, safe talk; this is usually ineffective.
Courage is one of the things that Shannon had supremely. You have only to think of his major theorem. He wants to create a method of coding, but he doesn’t know what to do so he makes a random code. Then he is stuck. And then he asks the impossible question, “What would the average random code do?” He then proves that the average code is arbitrarily good, and that therefore there must be at least one good code. Who but a man of infinite courage could have dared to think those thoughts?
I saw Walter Houser Brattain when he got a Nobel Prize. The day the prize was announced we all assembled in Arnold Auditorium; all three winners got up and made speeches. The third one, Brattain, practically with tears in his eyes, said, “I know about this Nobel-Prize effect and I am not going to let it affect me; I am going to remain good old Walter Brattain.” Well I said to myself, “That is nice”. But in a few weeks I saw it was affecting him. Now he could only work on great problems.
The great scientists, when an opportunity opens up, get after it and they pursue it. They drop all other things. They get rid of other things and they get after an idea because they had already thought the thing through.
And you’re aware your dreams are, to a fair extent, a reworking of the experiences of the day. If you are deeply immersed and committed to a topic, day after day after day, your subconscious has nothing to do but work on your problem. And so you wake up one morning, or on some afternoon, and there’s the answer.
With a bit of modification: Per who works with the door open gets all kinds of interruptions, but per also occasionally gets clues as to what the world is and what might be important.
One of the characteristics you see, and many people have it including great scientists, is that usually when they were young they had independent thoughts and had the courage to pursue them. For example, Einstein, somewhere around 12 or 14, asked himself the question, “What would a light wave look like if I went with the velocity of light to look at it?”
Great scientists tolerate ambiguity very well. They believe the theory enough to go ahead; they doubt it enough to notice the errors and faults so they can step forward and create the new replacement theory. If you believe too much you’ll never notice the flaws; if you doubt too much you won’t get started.
At first, everything was moving very slowly, but despite the slow progress, I kept going. Now I see that slow progress doesn’t necessarily mean bad progress. In fact, sometimes it’s good and gives me hope because it makes me see that everything is achievable and that any task is possible, even when it seems incomprehensible and daunting at first.
[…] I concluded that although starting something might be hard, continuing it shows you that nothing in the world is truly difficult, and you can enjoy starting anything.
I find that the major objection is that people think great science is done by luck. It’s all a matter of luck. Well, consider Einstein. Note how many different things he did that were good. Was it all luck? Wasn’t it a little too repetitive? Consider Shannon. He didn’t do just information theory. Several years before, he did some other good things and some which are still locked up in the security of cryptography. He did many good things.
Not long after this was written, I started working on Flux Garden, “The GarageBand of Scientific Computing” […]
Maybe the most valuable result of the project was coming to recognize that a commercial app was too narrow, in many ways, for what I wanted to accomplish. One way was that the computer screen was too confining.
I envisioned living in a world which explained itself through mathematical models, where mathematical modeling and exploration was something that people did together, casually and constantly, with ordinary materials in the real world. I wanted to “do math with my hands”.
[…] people in the newspaper industry saw the web as a newspaper. People in TV saw the web as TV, and people in book publishing saw it as a weird kind of potential book. But the web is not just some kind of magic all-absorbing meta-medium. It’s its own thing.
One can spend a lot of time defining a medium in terms of how it looks, what it transmits, wavelengths used, typographic choices made, bandwidth available. I like to think about media in terms of questions answered.
Everything powered by ambition comes with compromise and taint, and is made under ridiculous circumstances. Everything good is transmuted from grudge-fueled self-doubt into something that other people love and criticize, knowing they could do better if given the time and resources.
Books are not product. Books are creative endeavors as individual and singular as any work of art. They cannot be tweaked as if they are idling wrong. They can’t have leaves pulled off as they rot like a cabbage or lettuce.
Hope is a dimension of the spirit. It is not outside us, but within us. When you lose it, you must seek it again within yourself and in the people around you, not in objects or even events.
Physics envy, or science envy, is often found when fields wind up having trouble getting traction, and it’s usually because they’ve forgotten to do science and instead just take the forms of science without the actual substance.
[…] I believe that the only kind of science computing can be is like the science of bridge building. Somebody has to build the bridges and other people have to tear them down and make better theories, and you have to keep on building bridges.
Translated from Persian: Shared experience of tragedy makes the hearts closer. That’s what connects us—you and me—you see?
A curious thing I’ve noticed about aesthetic satisfaction is that our pleasure is significantly enhanced when we accomplish something with limited tools.
For example, the program of which I personally am most pleased and proud is a compiler I once wrote for a primitive minicomputer which had only 4096 words of memory, 16 bits per word. It makes a person feel like a real virtuoso to achieve something under such severe restrictions.
In medieval times, the first universities were established to teach the seven so-called “liberal arts,” namely grammar, rhetoric, logic, arithmetic, geometry, music, and astronomy.
Talking about the word “art”: If we go back to Latin roots, we find ars, artis, meaning “skill.” It is perhaps significant that the corresponding Greek word was τέχνη, the root of both “technology” and “technique.”
Doug Engelbart died today. His work has always been very difficult for writers to interpret and explain.
Technology writers, in particular, tend to miss the point miserably, because they see everything as a technology problem. Engelbart devoted his life to a human problem, with technology falling out as part of a solution. When I read tech writers’ interviews with Engelbart, I imagine these writers interviewing George Orwell, asking in-depth probing questions about his typewriter.
[…] ACM’s Editorial Board made the following remark as they described the purposes of ACM’s periodicals: “If computer programming is to become an important part of computer research and development, a transition of programming from an art to a disciplined science must be effected.” […] Meanwhile we have actually succeeded in making our discipline a science, and in a remarkably simple way: merely by deciding to call it “computer science.”
Implicit in these remarks is the notion that there is something undesirable about an area of human activity that is classified as an “art”; it has to be a Science before it has any real stature. He continues by defending the art:
When I speak about computer programming as an art, I am thinking primarily of it as an art form, in an aesthetic sense. The chief goal of my work as an educator and author is to help people learn how to write beautiful programs […] My feeling is that when we prepare a program, the experience can be just like composing poetry or music […] Some programs are elegant, some are exquisite, some are sparkling. My claim is that it is possible to write grand programs, noble programs, truly magnificent ones! […] computer programming is an art, because it applies accumulated knowledge to the world, because it requires skill and ingenuity, and especially because it produces objects of beauty. Programmers who subconsciously view themselves as artists will enjoy what they do and will do it better.
As printing technology changed, the more important commercial activities were treated first and mathematicians came last. So our books and journals started to look very bad. I couldn’t stand to write books that weren’t going to look good.
The most dangerous thought that you can have as a creative person, is to think that you know what you’re doing. Because once you think you know what you’re doing, you stop looking around for other ways of doing things, and you stop being able to see other ways of doing things, you become blind.
Atheists are just modern versions of religious fundamentalists: They both take religion too literally.
In fact, “atheism” is a term that should not even exist. No one ever needs to identify himself as a “non-astrologer” or a “non-alchemist.” We do not have words for people who doubt that Elvis is still alive or that aliens have traversed the galaxy only to molest ranchers and their cattle. Atheism is nothing more than the noises reasonable people make in the presence of unjustified religious beliefs.
I think it’s really interesting that the birth of modern mathematics is considered to be not any particular mathematical concept, but a user interface.
Talking about Steve Jobs: This sounds really simplistic, but it still shocks me how few people actually practice this—and it’s a struggle to practice—but it is this issue of focus.
Steve was the most remarkably focused person I’ve ever met in my life.
You can achieve so much when you are truly focused.
And one of the things that Steve would say is: “How many things have you said no to?” And I would have these sacrificial things—because I wanted to be very honest about it, and so I say: “Oh, I said no to this, and no to that…” But he knew that I wasn’t vaguely interested in doing those things anyway. So there was no real sacrifice.
What Focus means… is saying NO to something that—with every bone in your body—you think is a phenomenal idea. And you wake up thinking about it… but you say NO to it because you’re focusing on something else.
Fridman asks: “When you go up to heaven and meet God and get to ask one question that would get answered, what question would you ask?” “What kind of browser do you have up there?”
Point eight is enough. “In fact I’ve concluded that it’s really a good thing for people not to be 100% happy. I’ve started to live in accordance with a philosophy that can be summed up in the phrase “Point eight is enough,” meaning “0.8 is enough.”
You might remember the TV show from the 70s called “Eight is Enough,” about a family with eight children. That’s the source of my new motto. I don’t know that 0.8 is the right number, but I do believe that when I’m not feeling 100% happy, I shouldn’t feel guilty or angry, or think that anything unusual is occurring. I shouldn’t set 100% as the norm, without which there must be something wrong. Instead, I might just as well wait a little while, and I’ll feel better. I won’t make any important decisions about my life at a time when I’m feeling less than normally good.
In a sense I tend now to suspect that it was necessary to leave the Garden of Eden. Imagine a world where people are in a state of euphoria all the time — being high on heroin, say. They’d have no incentive to do anything. What would get done? What would happen? The whole world would soon collapse. It seems like intelligent design when everybody’s set point is somewhere less than 100%.”
A person’s success in life is determined by having a high minimum, not a high maximum. If you can do something really well but there are other things at which you’re failing, the latter will hold you back. But if almost everything you do is up there, then you’ve got a good life. And so I try to learn how to get through things that others find unpleasant.
It is estimated that between 30 and 50 percent of the food grown in the United States goes to waste. Food is left in the field if it doesn’t meet certain cosmetic standards of large buyers, even if it is perfectly good quality. Supermarkets routinely overstock their produce shelves in deliberate displays of abundance, knowing that a portion will spoil and be thrown away. Globally, about one-third of food is wasted, amounting to about 1.8 billion tons and worth approximately $1 trillion. All of this wasted food means wasted water, labor power, energy, and all the other resources that went into making it.
World Bank economists calculate that the wealthiest 10 percent of the world’s population uses close to 60 percent of all the world’s resources. […] If this richest 10 percent reduced their consumption to the average consumption of the rest of humanity, total global resource use would be cut in half. […] A 2015 report by the British charity Oxfam found that the wealthiest 10 percent were responsible for half of all emissions of greenhouse gases, whereas the poorest half of the world’s people were responsible for about 10 percent.
A 2015 UN-sponsored study estimated the annual unpaid costs of global industrial agriculture at over $3 trillion—significantly more than the economic value of the food produced.
When the sportswear company Puma decided to “go green” and put together an environmental profit and loss account in 2011, it quickly found that, if implemented, the corporation would have to dissolve itself.
The first law of hydrodynamics is that water flows toward money.
In its history, the EPA has mandated safety testing for only a small percentage of the 85,000 industrial chemicals available for use today. And once chemicals are in use, the burden on the EPA is so high that it has succeeded in banning or restricting only five substances, and often only in specific applications: polychlorinated biphenyls, dioxin, hexavalent chromium, asbestos, and chlorofluorocarbons.
Capitalist production, by collecting the population in great centers, and causing an ever increasing preponderance of town population … disturbs the circulation of matter between man and the soil, i.e., prevents the return to the soil of its elements consumed by man in the form of food and clothing; it therefore violates the conditions necessary to lasting fertility of the soil.
Describing a meeting “typical of those which happen every day in the City of London”: A group of Indonesian businessmen organized a lunch to raise £300 million to finance the clearing of a rain forest and the construction of a pulp paper plant. What struck me was how financial rationalism often overcomes common sense; that profit itself is a good thing whatever the activity, whenever the occasion. What happened to the Indonesian rain forest was dependent upon financial decisions made over lunch that day. The financial benefits would come to the institutions in London, Paris, or New York. Very little, if any, would go to the local people…. The rain forest may be geographically located in the Far East, but financially it might as well be located in London’s Square Mile.
When asked whether class war existed, billionaire investor Warren Buffett said: “There’s class warfare, all right, but it’s my class, the rich class, that’s making war, and we’re winning.”
A stark choice faces humanity: save the planet and ditch capitalism, or save capitalism and ditch the planet.
They claim this mother of ours, the earth, for their own and fence their neighbors away; they deface her with their buildings and their refuse. The nation is like a spring freshet that overruns its banks and destroys all that are in its path.
Mankind thus inevitably sets itself only such tasks as it is able to solve, since closer examination will always show that the problem itself arises only when the material conditions for its solution are already present or at least in the course of formation.
The modern world worships the gods of speed and quantity, and of the quick and easy profit, and out of this idolatry monstrous evils have arisen.
The word “ecology” (originally œcology) was first coined in 1866 by Ernst Haeckel, Darwin’s leading German follower, based on the Greek word oikos, or household. Ironically, the word “economy,” to which ecology is often nowadays counterposed, was derived much earlier from the same Greek root—in this instance oikonomia, or household management. The close family relationship between these two concepts was fully intended by Haeckel, who defined ecology as the study of Darwin’s “economy of nature.”
In this universe of privately controlled education, each charter school can choose the curricula of its choice: Evolution is just a theory, the Bible is a literal history, dinosaurs and human beings simultaneously inhabited the earth, men are superior to women, white Christians to everyone else, and so on. Private and charter schools are like websites: they can foster any belief, shatter the idea that there is anything called truth…
Uber’s drivers are the R&D for Uber’s driverless future. They are spending their labor and capital investments (cars) on their own future unemployment.
I’m an engineer for the same reason anyone is an engineer: a certain love for the intricate lives of things, a belief in a functional definition of reality. I do believe that the operational definition of a thing—how it works—is its most eloquent self-expression.
Yet, when we allow complexity to be hidden and handled for us, we should at least notice what we are giving up. We risk becoming users of components, handlers of black boxes that do not open or don’t seem worth opening. We risk becoming people who cannot really fix things, who can only swap components, work with mechanisms we can use but do not understand in crucial ways. This not-knowing is fine while everything works as we expected. But when something breaks or goes wrong or needs fundamental change, what will we do except stand helpless in the face of our own creations?
I fear for the world the Internet is creating. Before the advent of the web, if you wanted to sustain a belief in far-fetched ideas, you had to go out into the desert, or live on a compound in the mountains, or move from one badly furnished room to another in a series of safe houses. Physical reality—the discomfort and difficulty of abandoning one’s normal life—put a natural brake on the formation of cults, separatist colonies, underground groups, apocalyptic churches, and extreme political parties.
But now, without leaving home, from the comfort of your easy chair, you can divorce yourself from the consensus on what constitutes “truth.” Each person can live in a private thought bubble, reading only those websites that reinforce his or her desired beliefs, joining only those online groups that give sustenance when the believer’s courage flags.
We build our computers the way we build our cities—over time, without a plan, on top of ruins.
Working on the “Up and Down the Ladder of Abstraction” project: I’ve been feeling like a creative failure lately. There’s an essay in my head, it seems like an important one, and I’ve been trying to push it out of my head and onto a computer screen, and it’s clawing at the walls and biting my hands when I come near. Blegh blegh blegh. I can’t work. I can sort of force myself to work, but it’s not flowing the way it flows when it’s actually flowing. (And then the little demons crawl in and ask whether the final product will even be worth all of the effort, or if I’m just wasting my life and nobody will care, etc. etc.)
Today I ended up seeing Midnight In Paris and then the Pixar documentary, and the back-to-back Hemingway and Lasseter made me want to go Create Great Things, and then I thought about how my Great Things are going and instead I just want to bury myself in the backyard.
Traumatized people chronically feel unsafe inside their bodies: The past is alive in the form of gnawing interior discomfort. Their bodies are constantly bombarded by visceral warning signs, and, in an attempt to control these processes, they often become expert at ignoring their gut feelings and in numbing awareness of what is played out inside. They learn to hide from their selves.
If your parents’ faces never lit up when they looked at you, it’s hard to know what it feels like to be loved and cherished. If you come from an incomprehensible world filled with secrecy and fear, it’s almost impossible to find the words to express what you have endured. If you grew up unwanted and ignored, it is a major challenge to develop a visceral sense of agency and self-worth.
Coffee can be more enjoyable when shared.
If it’s inaccessible to the poor, it’s neither radical nor revolutionary.
Describing his similarities to Seiji Ozawa: First of all, both of us seem to take the same simple joy in our work. Whatever differences there might be between making music and writing fiction, both of us are happiest when absorbed in our work. And the very fact that we are able to become so totally engrossed in it gives us the deepest satisfaction. What we end up producing as a result of that work may well be important, but aside from that, our ability to work with utter concentration and to devote ourselves to it so completely that we forget the passage of time is its own irreplaceable reward.
Secondly, we both maintain the same “hungry heart” we possessed in our youth, that persistent feeling that “this is not good enough,” that we must dig deeper, forge farther ahead. This is the major motif of our work and our lives. Observing Ozawa in action, I could feel the depth and intensity of the desire he brought to his work. He was convinced of his own rightness and proud of what he was doing, but not in the least satisfied with it. I could see he knew he should be able to make the music even better, even deeper, and he was determined to make it happen even as he struggled with the constraints of time and his own physical strength.
The third of our shared traits is stubbornness. We’re patient, tough, and, finally, just plain stubborn. Once we’ve decided to do something in a certain way, it doesn’t matter what anybody else says, that’s how we’re going to do it. And even if, as a result, we find ourselves in dire straits, possibly even hated, we will take responsibility for our actions without making excuses. Ozawa is an utterly unpretentious person who is constantly cracking jokes, but he is also extremely sensitive to his surroundings, and his priorities are clear. Once he has made his mind up, he doesn’t waver. Or at least that is how he appears to me.
Creative people have to be fundamentally egoistic. This may sound pompous, but it happens to be the truth. People who live their lives watching what goes on around them, trying not to make waves, and looking for the easy compromise are not going to be able to do creative work, whatever their field. To build something where there was nothing requires deep individual concentration, and in most cases that kind of concentration occurs in a place unrelated to cooperation with others, a place we might even call dämonisch.
It is worth remembering that the internet wasn’t supposed to be like this. It wasn’t supposed to be six boring men with too much money creating spaces that no one likes but everyone is forced to use because those men have driven every other form of online existence into the ground. The internet was supposed to have pockets, to have enchanting forests you could stumble into and dark ravines you knew better than to enter. The internet was supposed to be a place of opportunity, not just for profit but for surprise and connection and delight. Instead, like most everything American enterprise has promised would hold some new dream, it has turned out to be the same old thing: a dream for a few, and something much more confining for everyone else.
When presented with objects that possess sharp angles or pointed features, a region of the human brain involved in fear processing, the amygdala, is activated. Likely a subconscious mechanism that evolved to detect potential threats, this fear response suggests that angular features influence the way in which objects are affectively and aesthetically perceived.
Secrets and mysteries hide best in an overly decorated and highly textured space. Mysteries prefer nooks and crannies, which can be found in abundance in Art Deco design, where patterns and prints throw curious shadows, deep tones and textures hide a multitude of sins, and secrets can be buried deep within. If the eye doesn’t know where to land, then it doesn’t know who to point the finger at.
“It’s astonishing how simple humans are,” says Pink to Romeo. “He becomes witness to a presidential assassination, his country is in uproar, his body is broken, but, hey, here’s the woman he fancies, and just like that he’s in a good mood.”
Peter had once heard that the AIs of modern rockets were modeled on the psyche of human suicide bombers. These intelligent weapons wanted to die a martyr’s death. Had someone convinced them that, in heaven, there would be seventy-two maintenance technicians for every one of them?
He wonders whether a schizophrenic would be taken seriously nowadays. “Doctor, I hear voices!” “Who doesn’t, Peter? Who doesn’t?”
“He said: the question today is how one can convince humanity to consent to their own survival.”
He glanced around at the motley collection of thugs, pimps, and record company executives.
Zaphod did not want to tangle with them and, deciding that just as discretion was the better part of valor, so was cowardice the better part of discretion, he valiantly hid himself in a closet.
This planet has—or rather had—a problem, which was this: most of the people living on it were unhappy for pretty much of the time. Many solutions were suggested for this problem, but most of these were largely concerned with the movements of small green pieces of paper, which is odd because on the whole it wasn’t the small green pieces of paper that were unhappy.
[…] Many were increasingly of the opinion that they’d all made a big mistake in coming down from the trees in the first place. And some said that even the trees had been a bad move, and that no one should ever have left the oceans.
“What’s the world’s greatest lie?” the boy asked, completely surprised.
It’s this: that at a certain point in our lives, we lose control of what’s happening to us, and our lives become controlled by fate. That’s the world’s greatest lie.
This single statement took the scientific world by storm. It completely revolutionized it. So many mathematical conferences got held in such good restaurants that many of the finest minds of a generation died of obesity and heart failure, and the science of math was put back by years.