Pythonista, programmer in a brain & cogsci department, flaming liberal
31 stories

Save The X-Files from itself. Replace its creator.

1 Share

Chris Carter has great ideas. He increasingly struggles to execute them.

Someone needs to save The X-Files from itself.

Okay, it's not in any imminent danger of ending, as it has been in the past. The recently concluded six-episode miniseries posted strong ratings (though they trended downward), so there will likely be more episodes at some point. And the core of the show is still strong. David Duchovny and Gillian Anderson are as terrific together as ever, and the miniseries even produced one incredibly strong installment in "Mulder and Scully Meet the Were-Monster."

The problem is that even though the miniseries had its moments, fully half of it was an absolute disaster, with three episodes that served as reminders of why the show eventually left the air in the first place.

The season premiere and finale both attempted to reinvent The X-Files' alien conspiracy and only revealed how threadbare that whole story had become. Meanwhile, the fifth episode — "Babylon" — was only saved by its status as a fascinating misfire, which tried to engender sympathy for Islamic terrorists and ended up doing roughly the opposite.

And all three of those episodes have one big thing in common — they were written and directed by series creator Chris Carter. Who must be stopped.

It wasn't always like this!

Mulder shouts! Fox
Just imagine Mulder shouting the above subhead.

Carter was never the strongest writer on his own show, but he was frequently at least a competent one in the show's first several seasons. And his direction was often terrific, especially in the black-and-white comedy "The Post-Modern Prometheus" (season five) and "Triangle" (season six), an episode that seems to consist of several very long, uninterrupted shots.

He's also always been a tremendous ideas man. The very concept of The X-Files is beautiful and elegant in its simplicity. It's one of the great TV premises of all time, and it's elastic enough to theoretically run forever.

Even in the miniseries, you can see that Carter still has good ideas. Building the new conspiracy from the ruins of the old one was a potent way of commenting on the existence of The X-Files as a miniseries, built atop the ruins of the old show. And his eye for casting remains — Robbie Amell and Lauren Ambrose as a sort of new Mulder and Scully duo were an inspired pairing.

But where Carter increasingly falls apart is in the execution of his good ideas. His writing has always trended toward the overly expository, but now exposition is all it is. His direction has fared better, but even in that he's going through the motions. And while Amell and Ambrose were great actors for the roles they played, the roles themselves ended up being nothing to write home about.

Even if you look back at Carter's '90s career, his best ideas usually emerged when others were helping steer the ship. The X-Files' writers' room was full of legendary scribes, while his follow-up series, Millennium, was at its best in its second season, when Carter stepped back and Glen Morgan and James Wong took over. (Carter returned for Millennium's third — and worst — season.)

None of this means Carter's work is without merit — after all, he created The X-Files. But sometimes things get lost between his head and the screen, especially if nobody else is around to help channel his ideas into a form that makes sense as a story.

So the best approach here is simple: Let Carter do what he does best by coming up with ideas. Then, whenever possible, hire others to shepherd those ideas to the screen.

A modest proposal for the future of The X-Files

The Lone Gunmen on The X-Files. Fox
Carter is still capable of great strangeness, as with the episode "Babylon," so it's worth letting him goof off a little bit.

Realistically, Chris Carter is never going to leave The X-Files entirely. It's his baby, and TV networks don't usually like changing captains on a hit show, so long as showrunners are turning things in on time and on budget. Also, it's highly unlikely that the alien conspiracy storyline will ever go away — even if it's a horrible mess.

But Carter probably shouldn't be writing and directing fully half of any future X-Files miniseries that happen. When the series was doing 22 to 25 episodes per season, Carter realistically couldn't do that. However, with a much more relaxed shooting schedule on the recent miniseries, he was able to, and that only further exposed some of the weaknesses in his work that have always been present.

My proposal would be to let Carter stick around but hire someone to help him out as a sort of guardian of the conspiracy storyline (a role that writer Frank Spotnitz — now on Amazon's The Man in the High Castle — used to fill on the original series). "Babylon" was weird enough to make me think Carter might still strike gold now and then.

He mentioned at a press event I attended that he would love to do a musical episode, and that he has an idea for a bottle episode he's never been able to pull off (one that he might try in a theoretical season 11; the miniseries was season 10). And since I often like Carter in experimental mode, I don't have a problem with him writing and directing an hour or two of whatever form The X-Files takes next. But not half!

The odds seem pretty good that if Fox orders more X-Files, it will also order more episodes than the miniseries' six. The show won't get a full 22-episode season, but 10 or even 13 episodes don't seem out of the question. With that much space to fill, Carter will almost certainly have to bring in more writers than the skeleton crew that composed the miniseries.

Obviously, he could just rehire the many illustrious writers who produced the original nine seasons — including Breaking Bad and Better Call Saul creator Vince Gilligan, for instance.

But Carter has always had a tremendous eye for talent when it comes to writers, too. So why not do a season written half by the old guard and half by new writers — either established TV hands who love the show or brand new people that Carter recruits? There must be tons of writers out there who would be hungry for the chance.

Wouldn't you love to see an X-Files episode written by Hannibal's Bryan Fuller, or a take on the conspiracy from The Americans' Joe Weisberg and Joel Fields? Wouldn't it be interesting to explore what The X-Files might look like if it were written by a woman, or a person of color, or someone barely in their 20s, who only knows the show from binge-watching it on Netflix?

Carter, of all people, can find great writers who haven't broken into the industry just yet, and if he gets to make more X-Files, he should try to further his greatest legacy: giving lots and lots of terrific writers their biggest break.

Read the whole story
2320 days ago
Rochester, NY
Share this story

Today you can get 2GB of Google Drive storage for free

1 Comment and 2 Shares

You can never have too much Google Drive space, right? At least that's what Google thinks, and it's giving away 2GB today to anyone who wants it.

This has become a yearly tradition for the company, with today's offer specifically honoring Safer Internet Day 2016. To get the extra Drive space, all you have to do is sign into your Google account and review your security settings, including factors like two-step verification, authorized devices, account verification settings, and a couple more. The process takes just a few minutes, and once you complete the check-up, you'll be awarded the extra 2GB of permanent Drive space for free.

Those who took advantage of the same Google promotion last year can rejoice, as they are welcome to snatch up this year's promotion as well. There's no word on when this 2GB offer will expire, but you have at least one week to complete the security check-up.

Read on Ars Technica | Comments

1 public comment
2335 days ago
Well that was quick and painless, and 2 free GB!
Somerville, MA

How to undo (almost) anything with Git


One of the most useful features of any version control system is the ability to "undo" your mistakes. In Git, "undo" can mean many slightly different things.

When you make a new commit, Git stores a snapshot of your repository at that specific moment in time; later, you can use Git to go back to an earlier version of your project.

In this post, I'm going to take a look at some common scenarios where you might want to "undo" a change you've made and the best way to do it using Git.

Undo a "public" change

Scenario: You just ran git push, sending your changes to GitHub, now you realize there's a problem with one of those commits. You'd like to undo that commit.

Undo with: git revert <SHA>

What's happening: git revert will create a new commit that's the opposite (or inverse) of the given SHA. If the old commit is "matter", the new commit is "anti-matter"—anything removed in the old commit will be added in the new commit and anything added in the old commit will be removed in the new commit.

This is Git's safest, most basic "undo" scenario, because it doesn't alter history—so you can now git push the new "inverse" commit to undo your mistaken commit.
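A minimal sketch of the revert flow in a throwaway repository (file names, messages, and identity are invented for illustration):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com && git config user.name "You"

echo "good line" > app.txt
git add app.txt && git commit -qm "Good commit"
echo "bad line" >> app.txt
git commit -qam "Bad commit"

# Create the "anti-matter" commit; --no-edit accepts the default revert message
git revert --no-edit HEAD

cat app.txt        # the bad line is gone
git log --oneline  # three commits: the good one, the bad one, and the revert
```

Because the bad commit is still in history, nothing was rewritten, and the revert commit is safe to push.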

Fix the last commit message

Scenario: You just typo'd the last commit message, you did git commit -m "Fxies bug #42" but before git push you realized that really should say "Fixes bug #42".

Undo with: git commit --amend or git commit --amend -m "Fixes bug #42"

What's happening: git commit --amend will update and replace the most recent commit with a new commit that combines any staged changes with the contents of the previous commit. With nothing currently staged, this just rewrites the previous commit message.
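A quick sketch of fixing the typo'd message (repository contents are invented):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com && git config user.name "You"

echo "fix" > bug.txt
git add bug.txt && git commit -qm "Fxies bug #42"

# Nothing is staged, so only the commit message is rewritten
git commit --amend -qm "Fixes bug #42"

git log -1 --pretty=%s
```

Note that `--amend` replaces the commit (the SHA changes), which is why this belongs before `git push`, not after.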

Undo "local" changes

Scenario: The cat walked across the keyboard and somehow saved the changes, then crashed the editor. You haven't committed those changes, though. You want to undo everything in that file—just go back to the way it looked in the last commit.

Undo with: git checkout -- <bad filename>

What's happening: git checkout alters files in the working directory to a state previously known to Git. You could provide a branch name or specific SHA you want to go back to or, by default, Git will assume you want to checkout HEAD, the last commit on the currently-checked-out branch.

Keep in mind: any changes you "undo" this way are really gone. They were never committed, so Git can't help us recover them later. Be sure you know what you're throwing away here! (Maybe use git diff to confirm.)
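The cat scenario, sketched in a throwaway repository (file name and contents invented):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com && git config user.name "You"

echo "careful work" > notes.txt
git add notes.txt && git commit -qm "Good state"

echo "jjjjjjjj" >> notes.txt  # the cat's contribution
git diff --stat               # confirm what you are about to throw away
git checkout -- notes.txt     # restore the file to its state at HEAD

cat notes.txt
```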

Reset "local" changes

Scenario: You've made some commits locally (not yet pushed), but everything is terrible, you want to undo the last three commits—like they never happened.

Undo with: git reset <last good SHA> or git reset --hard <last good SHA>

What's happening: git reset rewinds your repository's history all the way back to the specified SHA. It's as if those commits never happened. By default, git reset preserves the working directory. The commits are gone, but the contents are still on disk. This is the safest option, but often, you'll want to "undo" the commits and the changes in one move—that's what --hard does.
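The three-bad-commits scenario can be sketched like this (all names invented; the last good SHA is captured before the damage is done):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com && git config user.name "You"

echo "v1" > main.txt
git add main.txt && git commit -qm "Last good commit"
good=$(git rev-parse HEAD)    # remember the last good SHA

for n in 1 2 3; do
  echo "terrible idea $n" >> main.txt
  git commit -qam "Terrible commit $n"
done

git reset --hard -q "$good"   # commits gone, and --hard discards the changes too

git log --oneline             # only the good commit remains
cat main.txt
```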

Redo after undo "local"

Scenario: You made some commits, did a git reset --hard to "undo" those changes (see above), and then realized: you want those changes back!

Undo with: git reflog and git reset or git checkout

What's happening: git reflog is an amazing resource for recovering project history. You can recover almost anything—anything you've committed—via the reflog.

You're probably familiar with the git log command, which shows a list of commits. git reflog is similar, but instead shows a list of times when HEAD changed.

Some caveats:

  • HEAD changes only. HEAD changes when you switch branches, make commits with git commit and un-make commits with git reset, but HEAD does not change when you git checkout -- <bad filename> (from an earlier scenario). As mentioned before, those changes were never committed, so the reflog can't help us recover them.
  • git reflog doesn't last forever. Git will periodically clean up objects which are "unreachable." Don't expect to find months-old commits lying around in the reflog forever.
  • Your reflog is yours and yours alone. You can't use git reflog to restore another developer's un-pushed commits.


So... how do you use the reflog to "redo" a previously "undone" commit or commits? It depends on what exactly you want to accomplish:

  • If you want to restore the project's history as it was at that moment in time, use git reset --hard <SHA>.
  • If you want to recreate one or more files in your working directory as they were at that moment in time, without altering history, use git checkout <SHA> -- <filename>.
  • If you want to replay exactly one of those commits into your repository, use git cherry-pick <SHA>.
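The undo-then-redo round trip can be sketched end to end (file names and messages invented; HEAD@{1} means "where HEAD was one move ago"):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com && git config user.name "You"

echo "base" > work.txt
git add work.txt && git commit -qm "Base"
echo "precious" >> work.txt
git commit -qam "Precious commit"

git reset --hard -q HEAD~1       # oops: threw the precious commit away
git log --oneline                # it no longer appears in the log...
git reflog                       # ...but the reflog remembers every HEAD move

git reset --hard -q 'HEAD@{1}'   # jump HEAD back to where it was before the reset
cat work.txt                     # the precious change is back
```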

Once more, with branching

Scenario: You made some commits, then realized you were checked out on master. You wish you could make those commits on a feature branch instead.

Undo with: git branch feature, git reset --hard origin/master, and git checkout feature

What's happening: You may be used to creating new branches with git checkout -b <name>—it's a popular short-cut for creating a new branch and checking it out right away—but you don't want to switch branches just yet. Here, git branch feature creates a new branch called feature pointing at your most recent commit, but leaves you checked out to master.

Next, git reset --hard rewinds master back to origin/master, before any of your new commits. Don't worry, though, they are still available on feature.

Finally, git checkout switches to the new feature branch, with all of your recent work intact.
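A sketch of the whole maneuver. There's no real remote in this throwaway repository, so a recorded SHA stands in for origin/master (everything else is invented):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com && git config user.name "You"

echo "base" > base.txt
git add base.txt && git commit -qm "Base"
base=$(git rev-parse HEAD)       # stands in for origin/master in this sketch

echo "feature work" > feature.txt
git add feature.txt && git commit -qm "Oops, committed on the main branch"

git branch feature               # new branch pointing at the recent commit
git reset --hard -q "$base"      # rewind the current branch to the base
git checkout -q feature          # switch over; the work is intact

git rev-parse --abbrev-ref HEAD  # feature
ls                               # base.txt feature.txt
```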

Branch in time saves nine

Scenario: You started a new branch feature based on master, but master was pretty far behind origin/master. Now that master branch is in sync with origin/master, you wish commits on feature were starting now, instead of being so far behind.

Undo with: git checkout feature and git rebase master

What's happening: You could have done this with git reset (no --hard, intentionally preserving changes on disk) then git checkout -b <new branch name> and then re-commit the changes, but that way, you'd lose the commit history. There's a better way.

git rebase master does a couple of things:

  • First it locates the common ancestor between your currently-checked-out branch and master.
  • Then it resets the currently-checked-out branch to that ancestor, holding all later commits in a temporary holding area.
  • Then it advances the currently-checked-out branch to the end of master and replays the commits from the holding area after master's last commit.
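The three steps above can be watched in a throwaway repository (branch contents invented; the explicit checkout -qb master just pins the branch name regardless of Git's configured default):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git checkout -qb master
git config user.email you@example.com && git config user.name "You"

echo "base" > base.txt
git add base.txt && git commit -qm "Base"

git checkout -qb feature
echo "feature" > feature.txt
git add feature.txt && git commit -qm "Feature work"

git checkout -q master           # meanwhile, master catches up with upstream
echo "upstream" > upstream.txt
git add upstream.txt && git commit -qm "master moves ahead"

git checkout -q feature
git rebase -q master             # replay feature's commit on top of master's tip

ls                               # base.txt feature.txt upstream.txt
```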

Mass undo/redo

Scenario: You started this feature in one direction, but mid-way through, you realized another solution was better. You've got a dozen or so commits, but you only want some of them. You'd like the others to just disappear.

Undo with: git rebase -i <earlier SHA>

What's happening: -i puts rebase in "interactive mode". It starts off like the rebase discussed above, but before replaying any commits, it pauses and allows you to gently modify each commit as it's replayed.

rebase -i will open in your default text editor, with a list of commits being applied, like this:

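The original post showed the editor contents as an image, which hasn't survived here. A representative todo list (the SHAs match those discussed below; the commit messages are invented for illustration) would look like:

```
pick 0835fe2 Fix something in the feature
pick 6943e85 Add a new test
pick 38f5e4e Tweak the docs
pick af67f82 Oops, forgot a file
```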

The first two columns are key: the first is the selected command for the commit identified by the SHA in the second column. By default, rebase -i assumes each commit is being applied, via the pick command.

To drop a commit, just delete that line in your editor. If you no longer want the bad commits in your project, you can delete lines 1 and 3-4 above.

If you want to preserve the contents of the commit but edit the commit message, you use the reword command. Just replace the word pick in the first column with the word reword (or just r). It can be tempting to rewrite the commit message right now, but that won't work—rebase -i ignores everything after the SHA column. The text after that is really just to help us remember what 0835fe2 is all about. When you've finished with rebase -i, you'll be prompted for any new commit messages you need to write.

If you want to combine two commits together, you can use the squash or fixup commands, like this:
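Again the image is missing; a listing matching the pairings described below (messages invented) would be:

```
pick   0835fe2 Fix something in the feature
squash 6943e85 Add a new test
pick   38f5e4e Tweak the docs
fixup  af67f82 Oops, forgot a file
```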

squash and fixup combine "up"—the commit with the "combine" command will be merged into the commit immediately before it. In this scenario, 0835fe2 and 6943e85 will be combined into one commit, then 38f5e4e and af67f82 will be combined together into another.

When you select squash, Git will prompt us to give the new, combined commit a new commit message; fixup will give the new commit the message from the first commit in the list. Here, you know that af67f82 is an "ooops" commit, so you'll just use the commit message from 38f5e4e as is, but you'll write a new message for the new commit you get from combining 0835fe2 and 6943e85.

When you save and exit your editor, Git will apply your commits in order from top to bottom. You can alter the order commits apply by changing the order of commits before saving. If you'd wanted, you could have combined af67f82 with 0835fe2 by arranging things like this:

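One possible arrangement for that reordering (same invented messages as above):

```
pick   0835fe2 Fix something in the feature
fixup  af67f82 Oops, forgot a file
pick   6943e85 Add a new test
pick   38f5e4e Tweak the docs
```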

Fix an earlier commit

Scenario: You failed to include a file in an earlier commit, it'd be great if that earlier commit could somehow include the stuff you left out. You haven't pushed, yet, but it wasn't the most recent commit, so you can't use commit --amend.

Undo with: git commit --squash <SHA of the earlier commit> and git rebase --autosquash -i <even earlier SHA>

What's happening: git commit --squash will create a new commit with a commit message like squash! Earlier commit. (You could manually create a commit with a message like that, but commit --squash saves you some typing.)

You can also use git commit --fixup if you don't want to be prompted to write a new commit message for the combined commit. In this scenario, you'd probably use commit --fixup, since you just want to use the earlier commit's commit message during rebase.

rebase --autosquash -i will launch an interactive rebase editor, but the editor will open with any squash! and fixup! commits already paired to the commit target in the list of commits, like so:

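The screenshot didn't survive; a representative autosquash-prepared listing (the fixup commit's SHA a1b2c3d and the messages are invented) would look like:

```
pick  0835fe2 Earlier commit
fixup a1b2c3d fixup! Earlier commit
pick  6943e85 Later commit
```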

When using --squash and --fixup, you might not remember the SHA of the commit you want to fix — only that it was one or five commits ago. You might find Git's ^ and ~ operators especially handy. HEAD^ is one commit before HEAD. HEAD~4 is four commits before HEAD, or, altogether, five commits back.
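The whole fix-an-earlier-commit flow, including the ^ notation, can be sketched in a throwaway repository. Everything here is invented, and GIT_SEQUENCE_EDITOR=true is a standard trick (not from the article) that accepts the autosquash-generated todo list without opening an editor:

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com && git config user.name "You"

echo "root" > root.txt
git add root.txt && git commit -qm "Root"
echo "partial" > change.txt
git add change.txt && git commit -qm "Add change"   # oops: forgot forgotten.txt
echo "later" > later.txt
git add later.txt && git commit -qm "Later work"

echo "left out" > forgotten.txt
git add forgotten.txt
git commit -q --fixup HEAD^   # HEAD^ = "Add change", one commit before HEAD

# Replay everything since Root, folding the fixup! commit into its target
GIT_SEQUENCE_EDITOR=true git rebase -i --autosquash HEAD~3

git log --oneline             # Root, Add change (now with the file), Later work
```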

Stop tracking a tracked file

Scenario: You accidentally added application.log to the repository, now every time you run the application, Git reports there are unstaged changes in application.log. You put *.log in the .gitignore file, but it's still there — how do you tell Git to "undo" tracking changes in this file?

Undo with: git rm --cached application.log

What's happening: While .gitignore prevents Git from tracking changes to files or even noticing the existence of files it's never tracked before, once a file has been added and committed, Git will continue noticing changes in that file. Similarly, if you've used git add -f to "force", or override, .gitignore, Git will keep tracking changes. You won't have to use -f to add it in the future.

If you want to remove that should-be-ignored file from Git's tracking, git rm --cached will remove it from tracking but leave the file untouched on disk. Since it's now being ignored, you won't see that file in git status or accidentally commit changes from that file again.
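A sketch of the whole scenario (file contents invented):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com && git config user.name "You"

echo "oops" > application.log
git add application.log && git commit -qm "Accidentally track a log file"

echo "*.log" > .gitignore
git add .gitignore && git commit -qm "Ignore log files"

git rm -q --cached application.log    # untrack, but keep the file on disk
git commit -qm "Stop tracking application.log"

ls application.log        # still here
git status --porcelain    # clean: changes to the log file are now ignored
```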

That's how to undo anything with Git. To learn more about any of the Git commands used here, check out the relevant documentation:


lacour and the opportunity costs of intransigent irb reviews

1 Share

Of all of the issues brought up by the LaCour controversy, we have not devoted enough attention to one, in my view. The Yale Columbia* IRB made itself part of this problem.

In his initial comments to Retraction Watch, LaCour’s coauthor and Columbia political science professor Donald Green wrote,

Given that I did not have IRB approval for the study from my home institution, I took care not to analyze any primary data – the datafiles that I analyzed were the same replication datasets that Michael LaCour posted to his website. Looking back, the failure to verify the original Qualtrics data was a serious mistake.

This points to a real cost imposed by intransigent IRBs that become significant hurdles for research to progress. As institutions evaluate their response to this affair, and we reevaluate our own approaches to collaboration, those efforts would not be complete without considering the fact that IRBs hinder good, ethical research.

The subtext of Green’s comments boils down to the idea that the costs of obtaining IRB approval for a collaboration with a junior colleague were sufficiently high that he didn’t feel it was worth his investment to do so.

But given how bloated most IRBs have become, and the onerous tasks they require for various reasons — intrusion on academic freedom, board members not knowing particular research methods, mission creep, the desire to give scholarly rather than ethical advice — researchers now find ways of avoiding going through the process. The real-life ethical harm caused by potentially fraudulent research being accepted as true scientific findings, however, was worse than the potential harm to “respondents” that the IRB would have taken it upon itself to review.

In other words, we often fail to account for the opportunity costs imposed by overly restrictive IRB policies.

One can criticize Green for avoiding the IRB. In fact, in a private conversation (well, on a Facebook group), a friend, who is a mid-career political scientist with an outstanding reputation, did just that. She put a decent size of the blame at Green’s feet for this incident. But given the intransigence of IRBs and the ridiculous lengths to which they go, I can imagine doing the same thing that Green did to stay in the “letter of law,” if even that meant violating the spirit in some sense.

Institutions and professional organizations (oh hai ASA) should seriously seek to reform their IRBs. The AAUP has some recommendations, including eliminating many social scientific methods from IRB review. If institutions do not want to take the full step that the AAUP recommends of eliminating review for several social science methods, I would at least like to see institutions separate medical review from social science review boards. Being at an institution without a medical school, I appreciate the difference in the IRB review process here. On the one hand, it responds promptly and understands social scientific methods. On the other, or because of those reasons, it has caught hidden but real problems and provided useful ethical guidance.

I would ask Columbia: what became a larger liability to your institutional reputation, a stringent IRB review for a study with absolutely minimal risk, or being mired in a retraction scandal caused — in part¹ — by following the rule of IRB policy?

Update: I confused the institution of Peter Aronow, the Yale coauthor of the Irregularities report, with that of LaCour coauthor Donald Green (Columbia). I regret impugning Yale’s good name.

  1. I understand the extent of fraud in this case — if true — would be difficult to detect even if Green had IRB approval. That said, I worry less about outright fraud and more about unintentionally bad science that seeps out because IRBs make collaborating across institutions difficult. I also realize that LaCour has not yet had a chance to respond. But, again, the problem imposed by the IRB still stands, even if LaCour mounts a spirited defense.



4 Comments and 16 Shares
I need an extension for my research project because I spent all month trying to figure out whether learning Dvorak would help me type it faster.
4 public comments
2752 days ago
So so true!
2783 days ago
iPhone: 47.398945,8.541090
2791 days ago
"A" vs "B" vs "A vs B"
on a bike
2791 days ago
I'll think about this tomorrow

Brands of Nonsense

1 Comment

That’s the title of a piece of mine the Chronicle of Higher Education ran a little while ago. It’s paywalled but they have graciously given me permission to republish it here.

A little while ago, the University of Warwick was in the news for all the wrong reasons. Its longstanding legal firm, SGH Martineau, put up a blog post suggesting that universities should take action against “insubordinate” academics with “outspoken opinions.” The firm stressed the importance of making an example of offenders whose academic work was “brilliant,” lest other employees become tempted to emulate them.

Unfortunately for Warwick, this suggestion was made at precisely the time the university was seeking to remove an insubordinate professor, whose alleged offenses included “sighing” and “irony during job interviews,” though it appears his real offense was criticism of the British government’s higher-education policy.

The law firm’s post was couched in terms of the possibility of damage to the university’s “brand.” Universities have always been rightly concerned about their reputations. But the conversion in recent years to the language of branding has reached a fever pitch. Of course, in Warwick’s case, both the proposal to muzzle academics and the marketing-speak used to justify it did enough damage to offset, for some time, the efforts of its entire central-administration communications team, which employs almost 30 people, not to mention similar personnel in various schools and departments.

In Australia, Monash University proudly announced this year that it was the first organization in the world to acquire a “brand” top-level domain name—that is, an Internet address ending in “.monash” rather than the previous “” This trivial change cost $180,000, plus an annual fee of $25,000, and is part of the university’s expensively maintained “brand identity policy.”

In America, the University of Pennsylvania was an early adopter of this approach. In 2002 the Pennsylvania Gazette celebrated its centenary with a history titled “Building Penn’s Brand.” What might Penn’s most eminent sociologist, Erving Goffman (author of The Presentation of Self in Everyday Life), have made of this adoption of the language of image and “brand”?

Many American universities now have branding policies, and some affirm an unqualified commitment to the associated marketing ideology. The University of Florida, for example, states on its website:

“The importance of having a clear, recognizable brand can never be overstated. It defines us, separates us and communicates our relevance and value. It is especially important in an environment as vast and decentralized as the University of Florida. Thousands of messages leave the university every day, and each represents an opportunity to enhance—or fragment—our image. By maintaining consistent standards, we capitalize on the enormous volume of communications we generate and we present an image to the world of a multifaceted, but unified, institution.”

That statement summarizes all the key points of the ideology of branding. First there is the emphasis on image without any reference to an underlying reality. Second there is the assumption that the university should be viewed as a corporate institution rather than as a community. Third there is the desire to subordinate the efforts of individual scholars in research, extension, and community engagement to the enhancement of the corporate image. And finally there is the emphasis on distinctiveness and separateness. The University of Florida does not want to seem part of a global community of higher education, but rather as a competitor in a crowded marketplace.

Before considering this process further, we need some context. The authority on the history of the corporation and of brands is Alfred D. Chandler, whose books Strategy and Structure: Chapters in the History of the American Industrial Enterprise(MIT Press, 1962), The Visible Hand: The Managerial Revolution in American Business(Harvard University Press, 1977), and Scale and Scope: The Dynamics of Industrial Capitalism(Harvard University Press, 1990) are the definitive studies of the rise of the managerial corporation.

Chandler emphasizes the emergence of packaged and branded goods. Until the late 19th century, products like foodstuffs were sold in bulk by wholesalers, then measured out by retailers to individual customers. At every stage there were opportunities to increase profits by passing off a cheaper alternative for the good being sold. Shopkeepers’ reputations were the primary warranty. In the increasingly urban and mobile environment of the late 19th century, reputation, never a fully effective seal of quality, became even less so.

Branded products provided a solution. Now it was possible for consumers to repeatedly buy the same brand of product at various stores. The brand was a guarantee of consistent quality, not because of the trustworthiness of the corporation (of which the buyers would typically know nothing), but because its value depended on consistency and quality.

Consistency was the more important of the two. A low-quality product, provided that it was consistently adequate and appropriately priced, could benefit just as much from a brand as a higher-quality, more expensive alternative could. Indeed, the wealthy were the last to embrace branded products, instead patronizing bespoke tailors and personal providers of food and other services long after the middle and working classes were used to doing their shopping at Macy’s and A&P.

The great marketing discovery of the 20th century, pioneered by the advertising titan J. Walter Thompson, was that brands could do much more than guarantee a consistent level of objective quality. With the right advertising, a brand could come to embody connotations of all kinds, unrelated to the qualities of the product to which it was attached. Femininity or masculinity, luxury or solid good sense, excitement or security—all of these and more are part of “image.”

A third form of brand value arises when there are strong forces for customer “loyalty,” amounting, in some cases, to “lock-in.” For example, anyone who wants to use computers of designs descended from the IBM PC has little choice but to buy Microsoft operating systems like Windows.

And now we come to what may be the most striking feature of branding in higher education. Universities are corporate bodies, but they predate commercial corporations by many centuries. Long before the advent of packaged and branded goods, universities were certifying the quality of their students through the awarding of degrees.

Many criticisms of corporate branding apply equally to university degrees, and much of the voluminous literature on “credentialism” could be translated into the language of branding. The aim of degrees is, after all, to certify quality in the sense that a student has completed a course of study and acquired the associated knowledge and reasoning skills. And, as with brands that involve monopoly power, many degrees gain value from the fact that they are required for entry to particular professions. On the other hand, and with notable exceptions like the M.B.A., there has been little consistent effort to promote “brand image” to potential employers. Like a 19th-century brand, the degree has, in large part, gained its value from graduates rather than vice versa.

The rise of corporate-style branding has gone hand in hand with the devaluation of degrees through grade inflation. Grades in the A range have become the norm at leading universities. Reports that Princeton might roll back attempts to cap the proportion of A’s at 35 percent cite administrators’ fears that the policy discourages potential applicants and students’ complaints that it hurts their chances of getting jobs, fellowships, or spots in graduate or professional schools.

The “brand value” or “brand equity” of a company can be estimated as the intangible capital, beyond the company’s actual earnings, that may arise, as Chandler suggests, in three ways:

* The company is known to produce goods and services of a higher quality than competitors of similar cost (or similar quality at lower cost). Remember? That’s the 19th-century notion of brand.

* The brand reflects intangible attributes, through advertising, in the minds of consumers. That’s the early-20th-century notion.

* A brand’s component products work best together or with those made by partner brands. That’s the late-20th-century lock-in notion.

It is appropriate, therefore, that the world’s most valuable brand is Apple, because it hits the trifecta. It is widely perceived as the highest-quality and most consistently innovative maker of computing devices. Its products carry a cachet of sophistication emphasized by the famous “I’m a Mac … and I’m a PC” ads. And (except for a brief period in the 1990s when Macintosh “clones” were marketed on a small scale) anyone who wants to use Apple operating systems has to buy an Apple device, and vice versa.

How do those concepts apply to universities and, in particular, to undergraduate education, which remains the core business of most of these institutions?

The 19th-century notion of quality is established in the minds of students, parents, and just about everybody else. In fact, it is so well established that rankings of leading universities have barely changed since the hierarchy was established, in the second half of the 19th century. A blog post by the sociologist Kieran Healy on Crooked Timber compares a ranking produced in 1911 with the most recent U.S. News rankings and finds a close correlation (except that elite private universities, as a group, have improved their status relative to state flagships).

In that sense, then, university brands are strong. But brand relativities that endure regardless of the competence of university leaders, the vagaries of scholars and departments, and the efforts of marketing departments are not really of much interest.

None of this is to say that there are no differences in quality among those captured by these very stable rankings. At any given time, the quality of departments in any university will vary widely. Some will be making great strides in teaching and research. Others will be riven by internal divisions, or wedded to outdated and discredited approaches to pedagogy and research methodology. But there is no way to discover such things from branding exercises at the university level.

Key branding efforts focus on intangibles. In this respect, university branding has been an embarrassing failure both by the industrial standards of the advertising sector and by the intellectual standards that universities are supposed to uphold. For example, virtually every Australian university has adopted (replacing the Latinate motto that used to adorn its crest) a branding slogan: “Know more. Do more.” “Where brilliant begins.” Good luck trying to match a particular slogan with its respective university. (Disclosure: I am, perhaps, bitter that my own proposed branding slogan, “UQ, a university not a brand,” did not find favor with my institution’s marketing department.)

Finally there is the question of lock-in. A university degree is a required ticket for entry to many professions, and where state-level licensing applies, the range of choices may be limited. At the top end, access to various elite jobs is confined largely to the products of Ivy League and similarly elite institutions. That is a form of lock-in that adds to “brand value,” but in a socially unproductive way.

Branding, as applied to higher education, is nonsense. Colleges are disparate communities of scholars (both teachers and students) whose collective identity is largely a fiction, handy during football season but of little relevance to the actual business of teaching and research. The suggestion that a common letterhead and slogan can “present an image to the world of a multifaceted, but unified, institution” is comforting to university managers but bears no correspondence to reality.

The idea of universities as corporate owners of brands is directly at odds with what John Henry Newman called “the Idea of a University.” To be sure, that idea is the subject of contestation and debate, but in all its forms it embodies the ideal of advancing knowledge through free discussion rather than burnishing the image of a corporation. In the end, brands and universities belong to different worlds.

John Quiggin is a fellow in economics at the University of Queensland, in Australia; a columnist for The Australian Financial Review; a blogger for Crooked Timber; and the author of Zombie Economics: How Dead Ideas Still Walk Among Us (Princeton University Press, 2010).
