Young Economics.

Archive for August 2009

Genealogy Problem

with 3 comments

Problem of the Day: Just to keep our minds nimble through the summer I thought I’d post a little probability puzzle. My dad has recently been studying genealogy. He remarked to me yesterday that in one line of our family tree the same last name persisted for 500 years. I was amazed at first, but then began to wonder about the expected number of generations a last name sticks around for. So here’s the setup/assumptions of this toy problem.

  • Each couple has C children that marry and reproduce.
  • Each child is equally likely to be male or female.
  • Females don’t carry on the last name (there’s no incest, females marry males with different last names, and wives always take husbands’ last names).
  • We start with a married male with the last name EconLove.

Part 1: Work out the probability that the name EconLove will survive for at least Y additional generations. (Hint: I didn’t find a closed-form equation, but rather a recursive one.)
Part 2: For C=2, what is the cut-off number of generations at which the probability of EconLove surviving that many generations drops below 1/2? For what values of C will the probability of EconLove lasting Y generations never drop below 1/2 for any Y? (Hint: there’s an intuitive reason for this last one.)
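For readers who want to experiment numerically, here’s a minimal Python sketch of one way to set up the recursion (this encodes my own reading of the problem as a branching process, not the posted answer): a father’s male line dies out within n+1 generations exactly when every one of his sons’ lines dies out within n generations, and his number of sons is Binomial(C, 1/2).

```python
def survival_prob(C, Y):
    """P(the EconLove name survives at least Y more generations),
    starting from one married male, under the post's assumptions."""
    # q_n = P(one male's line dies out within n generations); q_0 = 0.
    # He has k sons with probability Binomial(C, 1/2), and his line dies
    # within n+1 generations iff all k sons' lines die within n, so
    #   q_{n+1} = sum_k C(C, k) (1/2)^C * q_n^k = ((1 + q_n) / 2)^C.
    q = 0.0
    for _ in range(Y):
        q = ((1 + q) / 2) ** C
    return 1 - q
```

As a sanity check, surviving at least one generation just means having at least one son, so survival_prob(C, 1) should equal 1 − (1/2)^C. For C=2 the expected number of sons is exactly 1, the critical case for a branching process, so the survival probability keeps falling toward zero; for C≥3 it converges to a positive limit instead.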

I’ll wait a couple days to see if people have questions, and then I’ll post my answers.

Written by Brian Quistorff

August 19, 2009 at 12:02 pm

Posted in Uncategorized

A reading for the Beer Appreciation Society


[A] study of almost 1,700 women, published in the journal Nutrition, found bone density was better in regular drinkers than non-drinkers.

Of course, there are caveats.  They are in the article, but for the purposes of beer appreciation propaganda, we need not report them here.

Written by Alex

August 17, 2009 at 6:15 pm

The Überdistortion


From Create Your Own Economy, the new book by Tyler Cowen of the Marginal Revolution blog:

Standard behavioral economics views “framing effects” as distorting our decisions, but in many circumstances framing effects help make our lives real, vivid, and meaningful, just as Twittering can make our smallest choices more salient.  We choose to send or receive messages in particular ways, in part, to determine which kinds of framing effects will influence our thoughts and emotions.

. . .

The better way to understand human imperfections is to focus on what I call an überdistortion, namely that we, when selecting from a broad menu of options, don’t always make the right choice of framing effects.  In other words, if you want to make better decisions, you should be more self-reflective about how you are choosing to frame the messages you send and receive.

. . .

Competition gives you the chance to construct the whirlwind of influences that you most prefer.  For that process to work smoothly, try to avoid the überbias of picking the wrong framing effects.  Focus your wisdom on choosing the right media for your messages.  (pp. 78-79, 89)

(Obviously, the link to Wikipedia was inserted by me and is not in the book.)

Cowen is writing about the ways in which the information age allows us to customize our own individual relationship to the world by choosing particular ways of filtering the bits of information (cultural goods, friendships, news, etc.) that we consume.  In the above excerpt, he suggests that the meta-level choice of how we frame information is the important choice to get right if one wants to live a meaningful life in the information age — more important, perhaps, than the information itself.

Is this good advice?  What principles should one apply in choosing the ‘best’ frames for oneself as a consumer of information?  Does explicitly recognizing and embracing one’s own frames imply the abandonment of disinterested impartiality in information consumption (even if that was always just a pretence anyway)?  If everyone thought this way about the information they consume, would we all become dangerously postmodern?

The book is pretty good so far.

Written by Alex

August 17, 2009 at 3:13 pm

Posted in Uncategorized


Economic inequality in Canada


Professor Gordon at the great Worthwhile Canadian Initiative comments on economic inequality in Canada:

The increase in the income share at the top end of the top end of the distribution is sometimes expressed in terms of the gains in the top quintile (top 20%) or the top decile (top 10%). But it’s clear that the real winners since 1985 are those above the 99th percentile. And even within this group, the gains are concentrated at the upper end:

[Figure: Income growth by percentile]

It’s important to make this distinction. As a matter of statistics, it’s perfectly true that people who are in (say) the top 10% have received the lion’s share of gains to national income. But people who are at the 90th or even the 95th percentile could fairly object to such a broad brush, because they – like the people at the median – haven’t seen much in the way of increases in income either. So when you talk about ‘the rich’, it’s important to restrict attention to those making (say) $400-500k/yr or more.

Everything Prof. Gordon says is true.  All the action in economic inequality in recent decades has been at the very, very top of the distribution, and it’s important to keep that fact in mind when discussing inequality.  However, the other side of that same coin is the inaction in the rest of the distribution.  Look at this graph from Osberg (2008):

[Figure: Real wage]

I think this is one of the most striking images in empirical economics.  (It includes the total value of all hourly compensation.)  It emphasizes two points.  First, the trend toward a more unequal income distribution since approximately 1980 is not merely a matter of super high-income jobs (financiers, professional athletes and star entertainers) experiencing very fast wage growth.  It is essentially a distributional issue; around 1980 there was some significant change in the mechanisms by which national income is distributed across the labour force, a change that caused the income gains for some jobs to rise dramatically and those for the rest of the economy to more or less end.  Second, this change in society’s distributional mechanisms was a radical departure from the pre-1980 economic regime.

It’s important to take note of both aspects of the economic inequality trend — the rise in incomes in the top percentile, and the stagnation of the rest of the distribution — because they both have implications for how we should address the issue.  I think that the stagnation of real income for most of the income distribution is a more significant problem than the mere fact that measured inequality is rising due to gains at the top end.  To solve this, we have to figure out exactly what changed around 1980 to cause the sudden shift in the distribution mechanisms in Canada (and the US and much of Europe).  Meanwhile, the fact that most of the income gains are going to a very small part of the population might limit the degree to which redistributive taxation can reduce inequality.  The super-rich tax base is pretty small, so extracting a lot of revenue from them would require very high tax rates that would probably have significant incentive effects.  In particular, it would make Canada even less attractive (relative to the US) for high earners.

Written by Alex

August 16, 2009 at 8:57 pm

Posted in Uncategorized


Interesting game theory application

with one comment

Bruce Bueno de Mesquita predicts Iran’s future at TED (if you don’t know TED, look it up). Very cool, and I like that PoliSci people are getting more rigorous. My only quibble with his presentation is when, making the case for his accuracy, he says “I got it 93% of the time when the experts got it wrong”. Well, did you also get it right when they did?
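To see the quibble concretely, here’s a toy calculation with entirely hypothetical counts (none of these numbers come from the talk): a high accuracy conditional on the experts being wrong is perfectly compatible with mediocre overall accuracy, unless we also know the hit rate when the experts were right.

```python
# All counts below are made up for illustration; the talk only reports
# the model's accuracy conditional on the experts being wrong.
experts_wrong = 100
model_right_when_experts_wrong = 93    # the quoted "93%"

experts_right = 900
model_right_when_experts_right = 450   # unknown in the talk; suppose only 50%

overall_accuracy = (model_right_when_experts_wrong
                    + model_right_when_experts_right) / (experts_wrong + experts_right)
# 543 right out of 1000 cases: 54.3% overall, despite the headline 93%
```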

Written by Brian Quistorff

August 16, 2009 at 8:31 pm

Posted in Uncategorized

Stimulus as a change in the discount factor

with 2 comments

Quick thought: Simple government cost-benefit analysis often involves discounting future benefits at some discount rate and checking whether they outweigh the current costs. The problem is that the recommendation often depends on the value of the discount rate. Certain projects, such as climate amelioration and US metrification, would only be undertaken with a discount factor close to 1 (equivalently, a discount rate close to zero, which equates future outcomes with present ones). I could imagine an economically-minded government that, in normal times, uses a particular discount rate, but that during times of recession (when stimulus is needed) uses a rate closer to zero so as to initiate more, but still arguably worthwhile, projects.
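A minimal sketch of the arithmetic, with entirely hypothetical numbers (the project, rates, and cashflows below are invented for illustration): the same long-horizon project can fail a cost-benefit test at a normal-times discount rate and pass it at a rate near zero.

```python
def npv(cashflows, rate):
    """Net present value of (year, amount) pairs at an annual discount rate."""
    return sum(amount / (1 + rate) ** year for year, amount in cashflows)

# Hypothetical long-horizon project: pay 100 today, receive 8 a year for 20 years.
project = [(0, -100.0)] + [(t, 8.0) for t in range(1, 21)]

normal_times = npv(project, 0.07)  # negative: rejected at a 7% discount rate
recession = npv(project, 0.01)     # positive: accepted at a rate near zero
```

The farther in the future the benefits lie, the more the verdict hinges on the rate, which is why projects like climate amelioration are so sensitive to this choice.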

Written by Brian Quistorff

August 15, 2009 at 10:39 am

Posted in Uncategorized

Theoretical 4th Ro(e)mer found!


  1. Christina Romer (Berkeley) – chair of the US Council of Economic Advisers
  2. David Romer (Berkeley) – married to the above and author of our Macro book
  3. Paul Romer (Stanford) – developed the endogenous tech change model we studied in Macro
  4. John Roemer (Yale) – PoliSci & Econ guy who used to be a Marxist

And to think only a year ago, I thought there were only two.

Written by Brian Quistorff

August 15, 2009 at 9:57 am

Posted in Uncategorized