Inspired by this quickie analysis, in which Fabio Rojas finds that 27% of the variance in exam grades is explained simply by class attendance, and also by the recent end of my semester, I decided to take a look at how attendance affects grades in my big undergrad class. His class is on social theory, and I assume it's taught as a single component; mine, by contrast, is a graphic design class split into related lecture and lab components. I have a formal attendance score for lab, but only a proxy for lecture: how many quizzes each student took.
The results? Pre-midterm quiz-taking has no significant relationship with scores on the midterm, which covers only lecture material; it accounts for just 1.7% of the variance. Lab attendance, on the other hand, is hugely predictive of total points earned on lab projects, accounting for 43.5% of the variance. I'm not terribly surprised by this, for two reasons: lecture material is much easier to get outside of class (primarily from the textbook) than lab material is, and quiz-taking is an imperfect proxy that only captures who showed up at the beginning of class on quiz days. The crummy nature of the variable is somewhat confirmed by the fact that lab attendance is a significant predictor of midterm score, accounting for 7.2% of the variance.
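For the curious, those share-of-variance figures are just R² from simple bivariate regressions — the squared correlation between attendance and points. A minimal sketch of that calculation, using made-up numbers rather than my actual gradebook:

```python
# Hypothetical data for illustration only -- NOT real gradebook numbers.
# attendance = lab sessions attended; lab_points = total lab project points.
attendance = [4, 7, 10, 12, 13, 14, 14, 15]
lab_points = [55, 62, 70, 78, 80, 85, 88, 90]

def r_squared(x, y):
    """Share of variance in y explained by a linear fit on x
    (i.e., the squared Pearson correlation)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var_x = sum((xi - mean_x) ** 2 for xi in x)
    var_y = sum((yi - mean_y) ** 2 for yi in y)
    return cov ** 2 / (var_x * var_y)

print(round(r_squared(attendance, lab_points), 3))
```

With these invented numbers the fit is nearly perfect (R² around 0.98); the real lab-attendance figure of 43.5% corresponds to an R² of 0.435.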
Filed: Leave Them Kids Alone || 12:53, December 20 || No Comments »
Of the pre-release excerpts from Walter Isaacson’s biography of Steve Jobs, the one that intrigued me the most was the one that related Jobs having “finally cracked” the next-generation television set. It was tightly related to the then-upcoming iCloud, but this bit of narration seems like the money quote: “No longer would users have to fiddle with complex remotes for DVD players and cable channels.”
Now rumors are rampant that Apple will soon unveil its TV, most recently suggesting it will come in three sizes (32″ at the low end, 55″ at the high end) and that the next iMac will bridge the gap by offering TV functionality. Key to all the rumors is the clear notion that Apple's television set, unlike the extant Apple TV, will be an all-in-one set, not a set-top box. Jobs seemed pretty set in his belief that a set-top box wouldn't work, even though a set-top box could deliver all the functionality he envisioned, and even though Apple already sells one.
For a couple reasons, I’m not sure what to make of this. First, most households already have at least a couple of set-top boxes — the cable/satellite receiver and a video game console, which doubles as a media player. My living room has three set-top boxes (a DirecTV receiver, a PlayStation 3 and a Wii), and that’s not counting the VGA cable that sometimes connects my laptop to the TV. The second issue follows directly from this separation of functionality — I need to be able to update or replace my TV distinctly from my set-top boxes, or the entire enterprise becomes cost-prohibitive. The all-in-one model that was attractive to computer consumers who flocked to the iMac shouldn’t look nearly as viable in a market a) full of people who are already comfortable plugging cables into their TVs, and b) lacking in big educational customers who want all-in-one devices for ease of maintenance.
At the same time, Jobs was also thinking about interface complexity — getting rid of all your complicated remotes. Part of this is about getting Siri on your TV and using voice commands to control everything, but I’m pretty skeptical that most TV consumers will want to do that. Imagine that you just want to channel-surf on a Saturday afternoon; do you want to keep saying, “next channel,” until you find something worth stopping for? Or you’re flipping through several football games to follow your fantasy team; is repeatedly calling out channel numbers an attractive control scheme?
Of course, none of this takes content availability into account. I really wonder whether the success Jobs had manhandling the big record labels left him overconfident in his ability to bend TV and movie studios to his will. The reason why most people have multiple set-top boxes is that they want access to multiple, exclusive content libraries. I want a broad selection of channels, including premium channels, to be available to me live; I need a cable or satellite receiver. I want to play Little Big Planet; I need a PlayStation 3. I want to play Super Mario Galaxy; I need a Wii. For other things I have other choices — if I want to play Blu-rays or watch Netflix, I can use my PS3 rather than add another device — but even the hyper-converged, iCloud-driven Apple TV isn’t going to get rid of the basic functionality of my existing equipment. So I’m still left wondering — unlike with the iPod, iPhone or iPad, but much like the current Apple TV — what the point of this device is. Voice commands and better user interfaces can be added to any existing box, and streaming video will be a central component of every TV device that’s ever designed from here on out. What benefit do I get from buying a TV that’s hardwired into that content control functionality?
Filed: aka Syscrusher || 15:50, December 9 || No Comments »
Apologies in advance, but if you don’t follow sports this post may not make much sense.
This afternoon, the Denver Broncos picked up their sixth win in seven games this season with Tim Tebow starting at quarterback. If you haven’t heard, Tebow has what might be called non-traditional passing mechanics, but as many commentators have noted, he “just wins.” There’s a lot that could be said, and already has been said, about the strange way in which quarterbacks are credited with team success in football, but that’s not really the point of this post. Rather, I want to point out how odd it is that seven games — even seven games that include six wins — can be considered so meaningful in football.
This stretch has turned Denver’s season around, to be sure. They lost four of their first five, but now find themselves tied for the lead in their division. But this is only possible because of the NFL’s relatively tiny schedule. Consider that, for a hockey goaltender — probably the only every-game player in North American major team sports with as much impact as a quarterback — six wins in seven games is barely noticeable; it’s less than a tenth of the season. For a baseball player (where there isn’t such a great analogue, since starting pitchers only go every fifth game), six wins in seven games is a good week. You could win your league’s player of the week award in May and be sent down to the minors in June. For Tebow, six wins in seven games is two months and half the season, and it’s especially significant when one of your divisional rivals (San Diego) is imploding at the same time.
But this is all perception; if we're trying to think about what this seven-game sample tells us about the larger population of games that is a player's career, seven games tell us next to nothing. It doesn't matter that an NFL season is only 16 games long; seven games simply aren't enough observations to reduce the sampling error to an acceptable level. A proportionally similar slice of a baseball season would be about 70 games: the same fraction of the schedule, but ten times as many observations, and therefore far less sampling error.
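To put rough numbers on that, here's a back-of-the-envelope sketch, under the (admittedly unrealistic) simplifying assumption that games are independent coin flips with a fixed win probability:

```python
import math

def win_pct_se(p, n):
    """Standard error of an observed winning percentage over n games,
    assuming independent games with true win probability p."""
    return math.sqrt(p * (1 - p) / n)

# A true .500 team over a 7-game NFL-style stretch
# vs. a 70-game baseball-style stretch.
print(round(win_pct_se(0.5, 7), 3))
print(round(win_pct_se(0.5, 70), 3))
```

For a true .500 team, the standard error of the observed winning percentage is roughly 19 percentage points over 7 games but only about 6 points over 70. In other words, going 6-1 over a week in baseball is well within noise; the same record in football feels like destiny only because the season is so short.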
So where does this perception error come from? Is it just the kind of rank innumeracy we see in many contexts? Maybe, but I suspect there's an important media effect as well. Sports media — both reporters and game broadcasters — and the sports culture they're embedded in frequently express hostility toward data-driven strategy. Narratives and tradition rule in sports, and when data contradict them, it's because data can't possibly capture the relevant "intangibles." Noting that a seven-game span isn't really an illuminating sample gets in the way of a lot of narrative structure.