My Subscriptions

April 14, 2014

The Gospel Coalition Blog

A Bubba With a Passion for the Gospel and Golf

The Story: On Sunday Bubba Watson, one of the most untraditional golfers on the PGA Tour, was the winner of the 2014 Masters Tournament. But golf isn't Watson's top priority. What he considers most important can be gleaned from the description on his Twitter account, @bubbawatson ("Christian. Husband. Daddy. Pro Golfer.") and his website, ("Loves Jesus and loves sharing his faith").

The Background: In an interview with Trevor Freeze of the Billy Graham Evangelistic Association, Watson tells how he uses his Twitter account—along with his PGA platform—to share about his faith in Christ.

"For me, it's just showing the Light," said Watson. "There's people who want to put down Christians. I try to tell them Jesus loves you. It's just a way to be strong in my faith."

After his first Masters win in 2012, Watson tweeted: "The most important thing in my life? Answer after I golf 18 holes with @JustinRose99. #Godisgood." Later that day he posted on his account, "Most important things in my life- 1. God 2. Wife 3. Family 4. Helping others 5. Golf"

"Lecrae said it the best," Watson said of the Christian rapper he listens to on his iPod. "He doesn't want to be a celebrity. He doesn't want to be a superstar. He just wants to be the middle man for you to see God through him."

Why It Matters: Christians have always been involved in professional sports, so why is the faith of superstars like Watson suddenly worthy of the public's attention? Because athletes like Watson show that it's still possible to be open and unapologetic about sharing the Gospel. Also, Watson may be one of the best in his sport, but he understands the importance of keeping his priorities in order, winsomely admitting that his life's calling is secondary to serving the Creator who has called him. To a culture that is both obsessed and disillusioned with fame and fortune, this centered perspective provides a refreshingly countercultural witness.

by Joe Carter at April 14, 2014 01:19 PM


Fortnightly Book, April 13

It might well be a busy two weeks -- I'm already behind on quite a few things -- so I need a re-read or something relatively unchallenging. So I've decided to go with something I haven't read for some years.

You might remember from the latter part of Little Women some of the story of Jo's struggles to write. One of the more vivid parts of that story consists of Jo writing 'potboilers' for newspapers, which intersects with her early interactions with Professor Bhaer; he provokes and encourages her to think of herself as capable of more, one of the signs that he is very good for her. Alcott knew something of potboiler-writing, both why one would do it and why it might drag one down, because she had by that point been a potboiler-writer, usually under the pen-name "A. M. Barnard", for quite some time. The fortnightly book is one of these potboilers: A Long Fatal Love Chase.

Alcott had been asked by her publisher to write a sensational work of twenty-four chapters in which the end of every second chapter introduces some hook to keep the reader reading. She wrote A Modern Mephistopheles: or The Fatal Love Chase in two months, drawing on her recent year-long trip to Europe. It was rejected, however, as too long and too sensational. It certainly does hit all the marks for sensationalism of the day, whether it be bigamy or suicide or a handsome Catholic priest who isn't a villain. Alcott worked on revising it, eventually using the main title for a completely different work, one which is essentially a retelling of Faust. The book was not published, however, until Kent Bicknell published it in 1995, in its pre-revision form, as A Long Fatal Love Chase. I picked it up shortly after it came out; it was a quite vigorous story, and it will be interesting to reflect on it here in two weeks' time.

One often finds people contrasting the work with Little Women and commenting on its strong, independent heroine. I think this is a point on which contemporary values end up distorting the reading, somewhat as if one were to read Pride and Prejudice and conclude that Lydia is the strong, independent woman rather than Elizabeth. Rosamond, the main character of A Long Fatal Love Chase, is on practically every score weaker and more dependent than the March sisters. She is strong-willed, yes, but her primary free choice consists of putting herself entirely into the power of a very dangerous man, a situation from which she stands no chance of extricating herself without the help of very brave men. Her 'year of freedom' is the Faustian bargain intimated by the repeated echoes of Goethe throughout the work: she receives nothing from Phillip Tempest but an illusory freedom and status as a pet and a toy. Her entire story is of moving from depending on one man to depending on another. For all that, she is a vividly written character in an interesting story, in which she learns the importance of "the serenity of a true heart strong to love, patient to wait" (p. 346). It is love and patience, however, not impetuosity and self-will, that hold the key to strength and independence, and it is learning this, however slowly and tragically, that makes Rosamond stand out from legions of sensational women characters coming to tragic ends.

If it turns out that this next two weeks is much less busy than I'm expecting, I'll add Alcott's Faust retelling, A Modern Mephistopheles, to this one. But I'm not promising anything on that score.


Louisa May Alcott, A Long Fatal Love Chase (New York: Dell, 1995).

by Brandon at April 14, 2014 12:53 PM

The Finance Buff

Find A Financial Advisor Outside Your Local Area

While looking for a feature photo for my articles on using a financial advisor, I only found ones like this:

Senior Adult Couple Going Over Papers in Their Home with Agent

The pictures show either a couple sitting across the table at the advisor’s office or the advisor coming to their home. The common phrase for working with an advisor is “sit down with an advisor.” Obviously it’s not saying you shouldn’t be standing up; it implies you should work with someone local.

The popular lead-generation sites for fee-only advisors — NAPFA, FPA, Garrett Planning Network — all start the search with your zip code, because most people look for an advisor local to them, just as they do when they look for a doctor, or for that matter, a car mechanic or a plumber.

Besides the mom-and-pop fee-only advisors found on NAPFA, FPA, or Garrett Planning Network, who else is local?

You have your investment person sitting in a bank or credit union branch. You have people working at Fidelity or Charles Schwab branches. You have people working for Ameriprise, Edward Jones, Merrill Lynch, Morgan Stanley, UBS, Wells Fargo, and a number of other companies. People representing insurance companies such as Northwestern Mutual and New York Life offer investments too.

Why are they local? Because people believe their advisor must be local.

Is it really necessary to have your advisor local to you? The advisors I listed in my article The Average Investor Should Use An Investment Advisor: How to Find One are located in different parts of the country. I didn’t even mention where exactly they are.

What makes an advisor local to you better than an advisor far away? Nothing. Unlike a doctor, an advisor doesn’t have to touch you in order to figure out what’s best for you. Good advice is good advice whether it comes from near or far. I can understand that people want a local bank branch for deposits and withdrawals, but trades are executed online anyway.

At work, I routinely work with colleagues two time zones away. We exchange emails and documents. We go on conference bridge lines, often multiple times a day. Sometimes we talk to people on another continent via video conference. We get quality work done.

When I had the consultation with the Vanguard advisor, I didn’t care where she was. I spoke to her by phone and via video conference. She answered my questions just as well as she could if I were sitting in her office.

We all tend to think our own questions, problems, and challenges are unique when in fact they are very common. Even though the Vanguard advisor didn’t have 20 years of experience in finance and investing, her answers were on target. My questions didn’t require 20 years of experience to answer. As a salaried advisor with clients handed to her day in and day out, she must get a lot of practice, probably far more than the independent CFPs who spend much of their time prospecting.

If you limit yourself to someone local, you will miss out on better advisors in terms of both quality and cost. Local people have better skills and more practice in sales and marketing, not necessarily in advice. If you just look around, answer calls from local people, or take referrals from friends or family, you will more likely find a salesperson, if only by the law of numbers.

Look beyond your local area for your financial advisor.

[Photo credit: Flickr user SalFalko]

Find A Financial Advisor Outside Your Local Area is copyrighted material from The Finance Buff. All rights reserved.

by Harry Sit at April 14, 2014 12:25 PM

Parchment and Pen

Overturning Tables


Matthew 21:12-13 And Jesus entered the temple and drove out all who sold and bought in the temple, and he overturned the tables of the moneychangers and the seats of those who sold pigeons. He said to them, “It is written, ‘My house shall be called a house of prayer’ but you make it a den of robbers.”

This is the last week of Jesus’ life on earth.  Every moment of this week drips with meaning and passion.  Jesus will stay in the immediate area of Jerusalem for the next 5 days until, “It is finished.”

How does our Creator kick-off His last week on earth? Jesus has every right to spend this day relaxing.  He could focus on saving His energy for the cross.  Jesus, however, has something more pressing than His own comfort.  He wants to give us all a very clear message.  Jesus pays the temple a visit.

For centuries people came near to God at the temple. In order to make a sacrifice for your sins you needed to give the priest a lamb. The innocent lamb would then symbolically take your sin.  The blood of the lamb, as it is being sacrificed, would pay for your sin.  If you were too poor to afford a lamb, you could purchase a pigeon.

As Jesus walked into the temple on this day he was filled with righteous anger.  Jesus was furious with the jacked-up religious nonsense happening at the temple. People were not allowed to bring their own animals for sacrifice.  The animals had to be purchased by the people running the temple. The animals were much more expensive at the temple than in the countryside. In addition to the animal prices being super high, the temple only allowed you to buy the animals with temple money. How did you get temple money?  Well, you had to exchange your perfectly good regular money for temple money.  Converting your money to temple money, of course, came with additional transaction fees.  People had to pay extra money to then buy the extra expensive animals.  The poorest of people were being robbed as they simply desired for God to forgive their sins.  This didn’t sit well with Jesus.

The Height of Love, Jesus, spent the beginning of this week taking care of business. He entered the temple and let his actions do the talking. The One who originally designed the temple now drove the crooks away.  He flipped over their tables. He flung the temple money as far away as possible.  Even the lowest of people will have full access to God. Jesus picks a fight with anyone keeping people from God.

Jesus makes the statement, “It is written, ‘My house shall be called a house of prayer’ but you make it a den of robbers.”  Jesus wants prayer.  He wants us speaking with Him.  He wants us near to Him.  He doesn’t want to take our money.  He wants our presence.  Our attention.  Our affections.

At the beginning of this Holy Week it is very clear Jesus is not here to keep things business as usual.  Jesus is communicating a very clear message. He will not stand by as religious systems are set up to make it nearly impossible to come to God.  At the end of this week Jesus will purchase our souls for God.  Jesus overturns the tables because He is a jealous lover. He is a relentless pursuer.

Jesus made a fool of himself but he didn’t care.  He did whatever was necessary to remove what was keeping people from God.  He will continue this week to do whatever is necessary to bring people nearer to God.  Jesus could have easily looked at our sin and simply walked away.  What height of love do we see as Jesus is driving out the robbers? Drink this in.  Let it change you.  Simmer in this truth.  We witness an incredible height of love as we see Jesus driving away anything keeping us from Him.

What is keeping you from drawing nearer to your God today?  Pray.  If your life were laid out on several tables would Jesus come and overturn any of them?  If so, repent.  Let the Spirit show you tables in your own life for Jesus to turn over.  Be convicted of anything Jesus needs to overturn in your life then let Him do it.  He is in control. He is good.  He loves you. He really loves you. He refuses to let you stay apart from Him.

Let the words of Psalm 139:23 consume your thoughts today, “Search me, O God, and know my heart! Try me and know my thoughts!”

by Tim Kimberley at April 14, 2014 12:19 PM

One Thing Well

Spiped



Spiped (pronounced “ess-pipe-dee”) is a utility for creating symmetrically encrypted and authenticated pipes between socket addresses, so that one may connect to one address (e.g., a UNIX socket on localhost) and transparently have a connection established to another address (e.g., a UNIX socket on a different system). This is similar to ‘ssh -L’ functionality, but does not use SSH and requires a pre-shared symmetric key.
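The description can be made concrete with a sketch of a typical invocation, based on spiped's documented flags; the addresses and the mail-port example here are placeholders, not from the post:

```shell
# Generate the pre-shared 32-byte key (both hosts need an identical copy).
dd if=/dev/urandom bs=32 count=1 of=spiped.key

# Server side: accept encrypted connections on port 8025 and decrypt
# them to a local daemon on port 25 (example addresses):
#   spiped -d -s '[0.0.0.0:8025]' -t '[127.0.0.1:25]' -k spiped.key

# Client side: encrypt connections made to localhost:8025 and forward
# them to the server:
#   spiped -e -s '[127.0.0.1:8025]' -t '[203.0.113.5:8025]' -k spiped.key
```

Traffic sent to localhost:8025 on the client then travels encrypted and authenticated to the server, which hands it to its local port 25.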

April 14, 2014 12:00 PM

Text Patterns

smileys, emoticons, typewriter art

I hate to be a party pooper — no, really: I hate it — but I just don't think Levi Stahl has found an emoticon in a seventeenth-century poem — nor, for that matter, that Jennifer 8. Lee found one from 1862.

About Stahl and Robert Herrick. If we were really serious about finding out whether Robert Herrick had used an emoticon, we’d look for his manuscripts — since we could never be sure that his printers had carried out his wishes accurately, especially in those days of highly variable printing practices. But those manuscripts, I think, are not available.

The next step would be to look online for a facsimile of the first, or at least a very early, edition, and while Google Books has just such a thing, it is not searchable. So, being the lazy guy that I am, I looked for nineteenth-century editions, and in the one I came across, there are no parentheses and hence no emoticon.

So it’s possible, I’d say likely, that the parenthesis in the poem was inserted by a modern editor. Not that parentheses weren’t used in verse in Herrick’s time — they were — but not as widely as we use them today and not in the same situations. Punctuation in general was unsettled in the seventeenth century — as unsettled as spelling: Shakespeare spelled his own name several different ways — and there were no generally accepted rules. Herrick was unlikely to have consistent punctuational practices himself, and even if he did he couldn't expect either his printers or his readers to share them.

So more generally, I think Stahl’s guess is ahistorical. The first emoticons seem to have been invented about thirty years ago, and are clearly the artifact of the computer age, or, more specifically, a purely digital or screen-based typewriting-only environment — because if you were printing something out before sending it, you could just grab a pen and draw a perfectly legible, friendly, not-rotated-90-degrees smiley, or frowney, or whatever, as people still do. Emoticons arose to address a problem that did not and does not exist in a paper-centric world.

And one final note: in the age between the invention of the typewriter and the transition to digital text, people certainly realized that type could make images — but they were rather more ambitious about it.

by Alan Jacobs at April 14, 2014 11:59 AM

The Tech Report - News

Report: NSA knew about Heartbleed bug for "at least two years"

Before it was patched and publicly disclosed a week ago, the Heartbleed bug lived inside OpenSSL for about two years. Amid last week's server-patching, password-resetting frenzy, some folks wondered whether hackers had known about the bug—and used it to pilfer people's personal data—before everybody else.

Well, wouldn't you know it, there's reason to think the NSA did just that. A recent story by Bloomberg quotes two unnamed sources "familiar with the matter" who claim the NSA has known about the Heartbleed bug for "at least ...


April 14, 2014 11:00 AM

Kevin DeYoung

Monday Morning Humor

I hope your pageant turns out better than this one.

And then there’s this one–the action starts around 1:15.

by Kevin DeYoung at April 14, 2014 10:34 AM

Cool Tools

Eco Brick

I love heating my house with my fireplace and wood stove. It’s carbon neutral, it targets the heat where I want it, and somehow it just feels warmer than forced-air heat.

I don’t love dealing with firewood. I don’t like storing it, trying to keep it dry, and I especially don’t like going outside when it is freezing cold to bring an armful inside.

That’s where Eco Bricks come in. They are compressed hardwood sawdust bricks that you burn in a fireplace just like logs. They are kiln dried and bug free, so they can be stored inside. Since they are kiln dried, they always light easily.

BTU-wise, the company says that a pallet of Eco Bricks is equivalent to a full cord of hardwood firewood. Where I am, a pallet runs $235, which is roughly the same as a cord of firewood.

Since these things are so dense and dry, some care must be taken not to over-fire your fireplace or stove. I’ve been using them for three winters, and haven’t had any problems yet.

I’ve got about a half pallet in my basement queued up. I’m looking forward to my first fire of the season.

-- Clark Case

Eco-Brick pressed sawdust fireplace fuel
Find a local retailer here

Sample Excerpts:

[Enjoy this video of a one-hour Eco Brick burn. - Mark]

by mark at April 14, 2014 09:00 AM


Chrysologus for Lent XL

Pray, brothers, that the angel may descend now and roll away all hardness from our heart, and remove the barriers to our understanding, and show that Christ has also risen out of our mental limitations, since just as that heart is heaven in which Christ lives and reigns, so too that breast is a tomb in which Christ is still held to be dead and buried. Just as we believe that Christ's death occurred, so too must we believe that it is entirely a thing of the past. Christ as Man suffered, died, and was buried; he is, lives, reigns, continues, and remains forever as God.

Sermon 75, section 4.

by Brandon at April 14, 2014 08:30 AM

Table Titans

Tales: With Ultimate Power


I was running a low-ish level 4e D&D game for some friends at my local game shop and the campaign was straight out of my brand new Book of Vile Darkness adventure book.

The party was lost in a thick patch of forest and had just stumbled into a clearing created by something massive having fallen…

Read more

April 14, 2014 07:03 AM

Bestiary: The Book of Vile Darkness


The word ‘evil’ is not quite strong enough to describe the horrors that lie within the Book of Vile Darkness. Add to it a touch of devastation, toxicity, cruelty, and terror... and you finally come close to what is in store within the pages.

Here are excerpts about The Book of Vile Darkness…

Read more

April 14, 2014 07:00 AM

Beeminder Blog

Heartbleed and Other Epic Crashes of Ineptitude

[Image: the infinibee with a bleeding heart]

We’ll start with the non-nerd version. Last week there was a massive security breach in some very standard software used by most sites on the internet, including Beeminder. Let us first quickly reassure you that your credit card info gets secured by our much more savvy payment processor, Stripe, and actually never even touches our servers.

We updated our server the same day the vulnerability was announced, thanks to our friend Joe Rayhawk, but, being really awful sysadmins ourselves, failed to notice we needed to restart the server to make it take effect. We did that the next morning when users started letting us know we were still vulnerable (thank you!). But then our server failed to boot back up normally and we spent a couple frantic hours, with help again from Joe, getting it operational.

Then four days later, presumably wholly unrelated, there was a power outage that took out the whole data center where Beeminder is hosted. That lasted for 3.5 hours.

In other embarrassing server-related news, there were a couple days last week when many of you were letting us know that you were seeing a lot of your Beeminder emails marked as spam. We asked our email service provider, MailGun, for help and they quickly fixed it. Definitely let us know if you ever see a Beeminder email marked as spam. That could be pretty devastating for us. Of course we always reverse derailments that were caused by any kind of technicality like that.

So this has not been a happy week at the beehive. Looking back over previous blog posts about our crashes of ineptitude, we saw that we had failed to blog about our biggest crash of ineptitude of all time, at least as measured in downtime (there are some other doozies in those previous blog posts — like nearly destroying our database and causing spurious derailments). That was in November, when we had a completely unacceptable 12 hours of downtime as we scrambled to recover after a hard drive failure, wasting precious hours thinking we had the original server repaired only to finally realize we had no choice but to start from scratch on a fresh server. (Bright side: zero data loss!)

Suffice it to say, we are hacking our guts out making things more robust. In the short term that may mean more downtime (as brief as possible!) which we’ll continue to live-tweet. Anything major we’ll first announce on our main twitter account, @bmndr, and point to @beemstat for details. (So no need to follow both of them.)

We also want to say how sorry we are about all this. Especially since the downtime in November should really have been a sufficient kick in the pants to improve our infrastructure, at the very least being able to quickly redeploy from scratch on a new server in case of total meltdown at wherever Beeminder happens to be hosted. And we do appreciate how much you rely on us. In fact, Bethany’s reaction to our most recent downtime was pretty funny. At some point there was nothing we could do but wait for our hosting provider to restore power and Bethany said, with zero intended irony, “I guess I’ll check my beeminders to see what else I need to do tonight while we’re wait- Oh. Right.” [1]

——— Non-Nerds Stop Reading Here ———

For more of the technical details, I’ll turn this over to our fearless CTO, Bethany Soule.


Mail Marked as Spam

The IP address our mail was coming from had gotten blacklisted, presumably due to someone else’s shenanigans. MailGun switched our IP and it seems to have solved the problem.

Power Outage

Saturday night a meteor full of zombies hit Newark and they gnawed through intertubes connecting our servers to the world wide web!

That didn’t really happen. But Linode’s entire Newark data center had a power outage and apparently a UPS failure along with it. Because power was totally out to the entire data center, and because of our current deployment architecture (ha! that’s just a fancy way to say “We’re running all of Beeminder’s components off the same physical (virtual) machine”) there was little we could do to recover without losing data back to our latest backup, which was over 12 hours old.


Heartbleed

Our servers run Ubuntu Linux and indeed were vulnerable to the Heartbleed bug for however long it’s been out there in the wild. We patched our libraries within several hours of Ubuntu releasing their updated packages, but failed to restart nginx to reload the updated libraries. We went to bed blissful in our ignorance of our ignorance. When you guys started emailing us in the morning we were no longer ignorant of our ignorance, but we were still ignorant of what in the world was wrong with our filesystem after updating the packages the day before. The server is running, and not vulnerable to Heartbleed, but it’s somewhat hobbled at the moment. Which brings me to my next point.

Getting better

Aside from the immediate clean-up from this latest failure, we are (with help from our savvier friends) making Beeminder robust at the very least to experiencing this exact failure again. We’re hard at work writing and testing an actual deploy script for setting up a new copy of the Beeminder web server seamlessly and much more quickly than previously. We’re also setting up database redundancy using Mongo’s replica sets so that if our server’s datacenter goes AWOL we can quickly fail over to the replica.
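For readers unfamiliar with replica sets: they are set up by declaring the member hosts once from a mongo shell, roughly as below. The hostnames and priorities are hypothetical, not Beeminder's actual topology:

```javascript
// Hypothetical hosts -- not Beeminder's real deployment.
rs.initiate({
  _id: "rs0",
  members: [
    // preferred primary in the main data center
    { _id: 0, host: "db1.example.com:27017", priority: 2 },
    // secondary in a different data center, eligible to take over
    { _id: 1, host: "db2.example.com:27017", priority: 1 },
    // arbiter: votes in elections but stores no data
    { _id: 2, host: "arb.example.com:27017", arbiterOnly: true }
  ]
})
```

If the first data center goes AWOL, the remaining members elect the secondary as the new primary, which is the failover described above.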



[1] Actually, thanks to the Android app’s robustness — normally useful when your phone doesn’t have data coverage — it was no problem to see what beemergencies we had left. One of them, for Bethany, was running, which the two of us did together at 2:30am after the server chaos settled down.

by dreeves at April 14, 2014 06:57 AM


Five Thoughts on #T4G 2014

With thousands of others from across the country, and indeed the world, this last week I had the privilege of attending the 2014 Together for the Gospel conference in Louisville, Kentucky. Far too much happened for me to adequately give an account of it all. Still, I had a few brief reflections on my experience I figured were worth sharing:

  1. Hospitality and Generosity - I only made it to T4G because of the generosity of others. I couldn’t have afforded it myself. From my friends on twitter lobbying to get me to the conference, to my gracious benefactor providing the ticket, my parents helping with airfare, and good friends giving me lodging, every single bit of this trip was due to the gracious giving of others. Along that same line, I was deeply struck by the hospitality of friends, in particular that of my hosts, the Clarks. Richard (my editor at Christ and Pop Culture) and his wonderful wife Jen put me up–and put up with me–for the whole of the conference, providing me with lodging, rides, and the warmth of their care. All of this without us ever having met in real life! I told them a number of times, either I have really low standards of hospitality, or they are champs at it. The entire experience left me with a deep, concrete picture of our generous, hospitable God who gives abundantly and makes undeserving sinners welcome in his home.
  2. New York Calvinists – I find I tend to live a parochial existence in my head. As much as I might affirm the existence of a global church where every tribe, tongue, and nation will one day (and even now) worship King Jesus, I don’t think I have a thick, lived sense of it most of the time. This is why it was such a delight to have the opportunity to meet, if only briefly, brothers and sisters serving, preaching, and teaching the same gospel all around the nation. I think of one brother I talked with briefly, serving young adults in a difficult area of Baltimore. Or again, of the pastors from Albany I ran into, talking in thick New York accents in the airport terminal about the love and wrath displayed in the cross. Or finally, my brother Johnny from New Jersey, serving youth in Detroit, who prayed with me for my college students as I was away from them on Thursday. God-centered ministry is happening in all sorts of places that it never occurs to us to think of as centers of gospel-work.
  3. Hey, I Follow You on Twitter – Following on from that point, I met a bunch of people I follow on Twitter (and occasionally, those who follow me). I think I noted this last year after the TGC conference, but it’s lovely to find out that the people you see tweeting and blogging all of this encouraging material actually believe it and are living it out. Beyond that, fellowshipping in the flesh with them made me realize both the blessings and the limitations of technology. I love that I know, laugh with, and am stirred up to service by so many that I know only through social media. That said, being in the same place, able to shake hands, embrace, and grasp hands in prayer made me keenly aware of the blessing of physical presence. As I think of the new friends I’ve made, and older friendships deepened, I begin to feel the weight of Paul’s longing to commune and worship with his brothers and sisters he can only write to and pray for in a new way.
  4. Evangelism is Awkward – So, the conference topic was evangelism and I have to say it was convicting and encouraging. I got on the plane Friday morning looking for new ways to engage my fellow passengers, or fellow travelers in the airport with the gospel, and you know what? I didn’t really get to. I mean, I’d strike up conversations, keen to look for opportunities to mention the gospel, and try as I might, I hit wall after wall. I don’t know if it was that I wasn’t bold enough, prayerful enough, or these were particularly difficult crowds (I mean, once people find out you’re a pastor, things either open up or shut down fast), but it just didn’t go anywhere. Why do I share this? Shouldn’t I wait until I have a nice little story with a bow on it about converting the atheist or the Muslim in the seat next to me? Maybe, but we need to be prepared to hit some difficulties along the road when it comes to sharing the gospel. It’s easy to get discouraged by one or two failed encounters and stop trying to find ways of sharing the news of Jesus. It’s also simple to fall into the trap of thinking this sort of thing just happens naturally and easily for pastors. It doesn’t. We have to work on it too. But remember that God is at work even in our “failed” attempts, working in our own hearts and lives, preparing us for greater service in his kingdom. God is a father who is pleased even with our stumbling efforts in his name.
  5. We Don’t Really Want What We Pray For – Finally, I’m once again reminded of God’s sense of humor. I rarely miss a college group meeting, so I tend to get a bit anxious the few times I have been away. This week was no different. Though I had my very trustworthy and capable buddy covering for me, great volunteers, and a pretty normal week, I was still kind of worried. That night, though, I prayed with a friend that God would show me that he could glorify himself in the group without me—that he would remind me of my essential unnecessariness (not sure that’s a word) in his works. Well, about an hour later I called to check in with my wife, who told me the group was packed, there were new people, things were bumping, and my first reaction was to think, “Oh great, the one week I’m not there to run things…” Then the thought struck me, “Isn’t this what you prayed for? For things to go smoothly without you? For God to show you he’s perfectly capable of handling things without you there?” And that’s when I was reminded of the reality that so often I don’t actually want the sanctification I pray for. I pray for patience and resent the situations that build it. I pray for compassion and try to harden my heart to opportunities to demonstrate it. Thank God that in his faithfulness, he answers according to our actual needs, not our whims.

As always, there’s more to say, but I’ll cap it there. All in all, the conference was another good gift from God’s hands whose blessings I can’t begin to number.

Soli Deo Gloria

by Derek Rishmawy at April 14, 2014 05:39 AM

The Gospel Coalition Blog

Cultural Engagement that Avoids Triumphalism and Accommodation

Greg Forster's important and practical new book helps Christians think through how to engage culture. Many would say this is not a proper goal for believers, but that is a mistake.

Acts 17 records Paul's famous visit to Athens, the academic center of the Roman Empire of the day. One commentator likened the intellectual power of Athens at the time to all the Ivy League schools plus Oxford and Cambridge rolled into one. Though Paul was repulsed by the idolatry he saw there, he did not turn away from the city in disgust. Instead, he plunged into the marketplace, the agora, where we are told he daily "reasoned" with those he found there about the gospel. Now when you or I think of a "marketplace," we think of shopping and retail. Of course the agoras of ancient cities contained that, but they were much more. The agora was the media center—the only place to learn the news at a time before newspapers and other technological media. It was also the financial center, where investors connected with businesses. It was the art center as well, the place where so much art was performed. It was the place where new political and philosophical ideas were debated. In short, the agora was the cultural center of any city. And since this was Athens—which along with Rome had the most influence of all cities—it could be said to be part of the cultural center of the Greco-Roman world. The ideas forged and accepted here flowed out and shaped the way the rest of society thought and lived.

It is instructive, then, to see that Paul takes the gospel literally into the public square. It means that he did not see the Christian faith as only able to change individual hearts. He believed that the gospel had what it took to engage the thinking public, the cultural elites, and to challenge the dominant cultural ideas of the day. He was after converts of course—he was first and foremost a church planter, not a theologian or Christian philosopher. But he wouldn't have been able to engage the hearts of cultural leaders unless he also engaged the ideas of the culture itself. He did not shrink from that challenge. He did not merely try to find individual philosophers to evangelize in a corner. He addressed them as a culture, a public community.

It is often missed that, although Paul was later invited to give an address, he did not start by preaching in the agora. He did not get up on a soapbox and merely declare what the Bible said. The text says Paul "reasoned" (Acts 17:17) in the marketplace, using a word—dialegomai—that sounds like "dialogue." However, as John Stott says in his commentary on Acts, this term probably denoted something more specific than we would think of today when we hear it. Stott says it was something closer to what we might call the Socratic method. This was not a "debate" as we see debates today, where two parties read off talking points at one another. It required lots of careful listening, and in particular it meant asking questions that showed that your opponents were self-contradictory—that is, wrong on the basis of their own premises. And indeed, when we actually hear Paul's address to the philosophers in Acts 17:22-31, we can't help but notice that he uses the Socratic method even here. He does not expound or even quote Scripture, but rather quotes their own thinkers (v. 28) and then shows them that, on the basis of their own intuitions and statements about God, idolatry is absolutely wrong (v. 29). Many have pointed out how Paul's address lays the foundation for a doctrine of God, contrasting the contemporary culture's belief in multiple, fallible, powerful beings who must be appeased with the idea of one supreme, sovereign Creator God who is worthy of awe-filled adoration and worship. Every part of what Paul says is deeply biblical, but he never quotes the Bible; instead he shows them the weaknesses and inadequacies of their own views of the divine and lifts up the true God for their admiration. He appeals as much to their rationality and their imaginations as to their will and hearts.

What It Is and Is Not

The term "cultural engagement" is so often used by Christians today without a great deal of definition. This account of Paul and Athens gets us a bit closer to understanding what it is by showing us what it is not. Christians are to enter the various public spheres—working in finance, the media, the arts. But there we are neither to simply preach at people nor are we to hide our faith, keeping it private and safe from contradiction. Rather, we are as believers to both listen to and also challenge dominant cultural ideas, respectfully yet pointedly, in both our speech and our example.

When Paul addresses the Areopagus, a body of the elite philosophers and aristocrats of Athens, he was, quite literally, speaking to the cultural elites. Their response to him was cool to say the least. They "mocked" him (Acts 17:32) and called him a "babbler" (v. 18), and only one member of that august body converted (v. 34). The elites laughed at him, wondering how Paul expected anyone to believe such rubbish. The irony of the situation is evident as we look back at this incident from the vantage point of the present day. We know that a couple of centuries later the older pagan consensus was falling apart and Christianity was growing rapidly. All the ideas that the philosophers thought so incredible were adopted by growing masses of people. Finally those sneering cultural elites were gone, and many Christian truths became dominant cultural ideas.

Why? Historians look back and perceive that the seemingly impregnable ancient pagan consensus had a soft underbelly. For example, the approach to suffering taken by the Stoics—its call to detach your heart from things here and thereby control your emotions—was harsh and did not work for much of the populace. The Epicureans' call to live life for pleasure and happiness left people empty and lonely. The Stoics' insistence that the Logos—the order of meaning behind the universe—could be perceived through philosophic contemplation was elitist, only for the highly educated. The revolutionary Christian teaching was, however, that there was indeed a meaning and moral order behind the universe that must be discovered, but this Logos was not a set of abstract principles. Rather it was a person, the Creator and Savior Jesus Christ, who could be known personally. This salvation and consolation was available to all, and it was available in a way that did not just engage the reason but also the heart and the whole person. The crazy Christian gospel, so sneered at by the cultural elites that day, eventually showed forth its spiritual power to change lives and its cultural power to shape societies. Christianity met the populace's needs and answered their questions. The dominant culture could not. And so the gospel multiplied.

Do we have Paul's courage, wisdom, skill, balance, and love to do the same thing today in the face of many sneering cultural leaders? It won't be the same journey, because we live in a post-Christian Western society that has smuggled in many values derived from the Bible but now unacknowledged as such. Late modern culture is not nearly as brutal as pagan culture. So the challenges are different, but we must still, I think, plunge into the agora as Paul did.

Greg Forster's new book does a marvelous job of showing us a way forward that fits in with Paul's basic stance—not just preaching at people, but not hiding or withdrawing either. Within these pages, believers will get lots of ideas about how to "reason" with people in the public square about the faith and how to engage culture in a way that avoids triumphalism, accommodation, or withdrawal. Paul felt real revulsion at the idolatry of Athens—yet that didn't prevent him from responding to the pagan philosophers with love and respect, plus a steely insistence on being heard. This book will help you respond to our cultural moment in the same way.


This article was adapted from the foreword to Joy for the World: How Christianity Lost Its Cultural Influence and Can Begin Rebuilding It (Crossway, 2014), by Greg Forster. This book is the second installment of the Cultural Renewal series edited by Tim Keller and Collin Hansen.

by Tim Keller at April 14, 2014 05:02 AM

Three Kinds of Shame

Sin is muddy. When it splashes, we rightly want to clean it up. But sometimes our zeal to clean causes us to oversimplify sin's muddiness by seeking trite answers for complex situations.

Consider the example of Jesus healing the blind man in John 9. This man had spent his entire life in darkness, and his misery had no comfort. His blindness brought shame. He couldn't get a job or volunteer in God's temple. All he could do was sit and beg.

Jesus' disciples asked a reasonable question: "Who sinned, this man or his parents, that he was born blind?" (John 9:2). In other words: Is he responsible, or are his parents responsible?

The disciples knew that shame is no accident, but unfortunately they knew of only two possible causes: immorality or abuse. And while we know Jesus will present a third perspective, let's consider these two options the disciples presented.

Shame #1: My Sin Against God

In this case, I did something I'm ashamed of, and I should be ashamed of it. There's a standard of right and wrong, moral and immoral, kindness and cruelty—and I broke it.

This is the shame of immorality. In moments of clarity we're horrified by our ability to be horrible. We've lied to people who trust us. We've ridiculed others to get a good laugh. We didn't wait for marriage, or we selfishly destroyed what could have been a sweet honeymoon. We've aborted our babies. We've touched people—perhaps even children—in ways they didn't want to be touched. We touch ourselves often, and we don't want to stop.

Jesus acknowledged that suffering and shame are sometimes caused by our own sin (John 5:14). But though we are blind, Jesus sees us. He wants to cleanse our mud.

Shame #2: Others' Sin Against Me

In this case, someone else did something to me and that person should be ashamed of it. There's a standard of right and wrong, moral and immoral, kindness and cruelty—and he or she broke it. But I'm stuck with the shame of it.

This is the shame of abuse. Do you replay the memories and wonder if you're a horrible person? Perhaps your best friend lied to you or betrayed your confidence. Perhaps you were the ridiculed outcast. Perhaps your dream date or honeymoon became a nightmare when your lover lost control. Perhaps you felt manipulated into getting an abortion. Or someone touched you where you didn't want to be touched. Maybe you even trusted that person—everybody trusted that person. When you told people about it, they didn't believe you.

Jesus acknowledged that innocent people sometimes suffer under the hand of evil (Luke 13:16). But though we are blind, Jesus sees us. He wants to cleanse our mud.

Shame #3: The Work of God in Me

Now we get to the blind man's true shame. "It was not that this man sinned, or his parents, but that the works of God might be displayed in him" (John 9:3).

Sometimes this is the most difficult kind of shame, because it seems to serve no purpose. There's nobody to blame for it but God, but you and I still have to bear the weight of it.

Do you carry the shame of being different, such as a physical deformity or speech impediment? Maybe people think you're not as pretty as the other ladies, or not as strong as the other guys. Maybe you feel attracted to people you know you shouldn't be attracted to. Perhaps you're too tall, too short, too clumsy, too geeky, too stupid, or too awkward.

Though we are blind, Jesus sees us. He wants to cleanse our mud and so work the works of God in us.

How to Minister to Shame

All three kinds of shame surround us. They fill our neighborhoods and our churches.

Foolish counselors and teachers assume all shame falls in only one category. For Job's miserable helpers, everything fell into the first category (your sin against God). Today, the spirit of the age puts everything into the second category (others' sin against you). We who are Calvinists sometimes overreact by placing everything into the third category (the work of God in you).

Wise counselors and teachers recognize shame's complexity, and they seek to understand the mud before laboring to clean it. They know the shame might get worse—by coming into the light—before it can get better. They empathize liberally, and they denounce sparingly. They speak of shameful things in a way that invites disclosure and doesn't drive the issue further underground.

For example, as you preach against abortion, do you put yourself in the shoes of those who have sought abortions? Does your tone and word choice invite confession and repentance, or does your harshness confirm their need for ongoing secrecy?

Are you honest about sin and shame, even while you take people to Jesus for cleansing?

For one blind beggar, the work of God showed him the reproach of Christ so he might bear courageous witness to it. Jesus—who could have given sight with a mere word—spit. Not a nice, clean spittle, but a loogy so wet and slimy that it turned dirt into mud and stuck to the man's eyes (John 9:6). Then Jesus sent him groping across Jerusalem to find a certain pool. Thus, having endured the shame of both blindness and healing, the man faced the Pharisees and staked the claim that earned ejection from the synagogue: "If this man were not from God, he could do nothing" (John 9:33-34).

Finally, when the man asked Jesus who the Son of Man is, he heard something that had never been said to him before: "You have seen him" (John 9:37). Jesus turned his shame into his glory, and he can do the same in our churches today.

by Peter Krol at April 14, 2014 05:01 AM

Growing Up Gothard

"You may run from sorrow, as we have. Sorrow will find you." — August Nicholson in The Village

My wife and I (Ted) were in the mood for a '90s movie, so we rented M. Night Shyamalan's The Village, which actually came out in 2004 but is still a '90s movie in terms of its earnestness and desire to be deep. It succeeds (in being deep) inasmuch as it always makes me think about the church, and about trends in the church.

In a nutshell, it's about a group of academics—all of whom have been deeply wounded by life in a fallen, sinful world—who decide to follow one charismatic leader (William Hurt) into forming an 1800s-style commune on a nature preserve. The idea is that if you take away everything modern and broken and hurtful about the world and replace it with floor-length skirts, suspenders, chickens, and primitive farm equipment, then nothing can hurt you. The movie then spins out a wonderful narrative that illustrates how there is no fleeing from total depravity. It finds us because it is in our hearts to begin with.

Utopia will elude humans, because sin causes the dystopia. Yet we still long for utopia and sometimes try like crazy to create it.

Recently, my friend Derek shared what life was like growing up inside the Bill Gothard movement in the 1980s and '90s. His account was utterly fascinating both in terms of how weird it was, and also how eerily similar it sounds (in some ways) to how some Midwestern Reformed families are rolling today with the homeschooling, chicken-raising, huge-family-having, government-disdaining, and so on. The Gothard movement, as far as I can tell, was part life-coaching, part parachurch organization, part homeschool curriculum, part subculture, and part arena show.

The Village and the Gothard arc show that in spite of our best efforts, sorrow still finds us. Children still get sick and still sometimes rebel. We still sometimes rebel and hurt people with our sin. Sorrow found the Gothard/ATI [1] empire recently, amid allegations of years' worth of sexual misconduct.

There are a few encouraging things that surface in Derek's story—namely that he came out of the Gothard experience in one piece spiritually and loves the Lord. His story prompts us to talk and think about what happens when people either follow an individual or a set of culturally mandated standards, and end up making those their operative gospel.

Here's Derek's story, in his words. [2]


I had a great childhood. My parents loved me and did their best to raise me and my siblings to be productive, thoughtful Christians. While I may disagree with some of the principles they followed, I cannot begin to even pretend that I have all of the answers. My reflections on my upbringing are a matter of perspective. I have no intention of misrepresenting Bill Gothard's views or the principles of ATI. I wish to simply share what I felt was overemphasized and underemphasized.

What is the draw?

Our society seems obsessed with systems. Whether raising children or creating your own backyard oasis, someone has a step-by-step guide that will take you to the Promised Land. We also have an obsession with doing things right, so it becomes logical to follow the system that promises the best results. The danger is that we quickly shift the focus from the goal of glorifying God to following a system. We then invest our trust in the effectiveness of the system, rather than the grace of God.

What was it like?

The aspect of ATI that has lingered longest in my life was the expectation of perfection. This idea was applied in a way that overemphasized the role of the individual at the expense of God's involvement. Furthermore, the categories in which perfection was expected extended beyond scriptural commands. A frustrating cycle of commitment, failure, guilt, and then recommitment pervaded my personal life. My family was not a "perfect" ATI family, so this cycle became a practice for our family as a whole.

The mission of ATI maintained an inward focus. Families isolated themselves from the evil influences of those outside of the system. Much like The Village by M. Night Shyamalan, parents secluded their families from outside threats by threatening their own families with God's judgment on rebels and sinners. At the very least, the outside world was painted as a place too dangerous for a Christian to live. The fatal flaw in the system (other than being completely contrary to the missional purpose to which we are called), is that sin was treated as an external force, rather than internal. The focus on the external resulted in a forced attempt at an appearance of godliness, while burying internal struggles.

This quarantined Christianity did not occur with physical barriers, but with outward expressions that demonstrated supposed inward spiritual change. All music with a drumbeat was frowned upon as it had a connection with demonic forces. Contemporary Christian music was just as evil. Cabbage Patch dolls were somehow connected with a devilish force or worldly influence. Circumcision was strongly, strongly, recommended for all males.

"Modest" dress was a must. The rows of navy blue, khaki, and white clothing at ATI conferences were a cross between a well-organized fan section and the North Korean military. Just as important as your dress was the expression on your face. You would be hard-pressed to find a "good" ATI family whose eyes were not shining like high beams while they flashed their pearly whites. If you were missing one of those qualities, you definitely were not going to end up with your family picture in an upcoming publication. Dyeing your hair was frowned upon as being too worldly, although the rumor was that Bill Gothard justified his own salon treatments as being necessary to prevent distractions regarding his appearance. ATI men did not have facial hair, but I do not know if it was forbidden or if men just wanted to be like Bill, who is sans mustachio.

One of the foundational truths of ATI was the "umbrella of protection." In a family structure, the father was the umbrella that protected his wife and children from Satan's attacks and God's judgments. If you stepped outside of that authority, you would face temptations and wrath. The umbrella came without an expiration date. As a teenager, the gradual increase of responsibility would not coincide with a gradual increase in decision-making. A young man would be eligible to step out from under the umbrella of protection only when he married. A young woman would only transfer from the father's umbrella to a husband's. This authoritarian approach forced the fear of both God and parents to become the main reason for obedience.

The ATI ministry structure was built around the same concept. Leadership within the organization provided the same protection from Satan and God. Questioning or challenging an interpretation of a verse or application of a principle was grounds for removal from the ministry.

Another cornerstone of the "barely in the world, but definitely not of, by, close to, around, or near the world" mentality was the ATI way for members of the opposite sex to interact. Part of the ATI teaching was, "Avoid Defrauding: To defraud another person is to stir up in them desires that cannot be righteously satisfied." While this teaching was specifically focused on "courtship," it outlined a system in which two major errors occurred. First, the blame was directed at the other person for "defrauding." It ignored the responsibility of the individual and encouraged isolationism. Second, the emphasis was on the external, not on the internal. My responsibility as a man was to not touch and not talk about marriage. The girls were responsible for covering up and not being flirty. If that was all taken care of, then nothing sinful could occur within our hearts, right?

Dating was of course far too worldly of a way to find a spouse. Enter courtship. Once a young man was prepared to support a wife and family, he was to approach the father of a young woman whose countenance [3] had caught his eye. As the young man was still under the umbrella of protection of his parents, his parents must approve of his choice, or even better, choose for him. The courtship should then be as short as possible to avoid any potential defrauding. The couple participated in primarily group activities, or chaperoned dates [4]. I distinctly remember listening to a couple tell their courtship story at the national ATI conference. When he proposed, he dropped the ring in his future wife's hand, saving all physical contact for marriage. The audience gave a standing ovation. I just kept wondering what was so bad about putting a ring on someone's finger.

Hero worship was definitely not one of the stated principles of ATI, but was on full display at any ATI gathering and embedded within the ATI curriculum. The Wisdom Booklet [5] often referenced a "hero of the faith" but always seemed to emphasize the strength of the individual over the faithfulness and grace of God. I remember reading D. L. Moody's statement, "The world has yet to see what God can do with a man fully consecrated to him. By God's help, I aim to be that man." Yet the focus was not what God could do, but on what man could do. These "heroes" were portrayed as arriving at a sinless life through dedication to perfection.

In a similar way, the best ATI families were frequently paraded at conferences or trainings. Big families with those beaming countenances were the top draw [6]. Extra-special bonus points were given if the family had a special musical or artistic talent that they could demonstrate for the jealous viewers. The thought was that if we could only be more like the Perfect Family, then everything would be so much better. It was a pyramid of legalism. Families networked and advanced through the system based on external factors. Other families worshiped the perfect families, while hating them for the ease with which they seemed to find perfection.

What were the results?

You were expected to be perfect, but the expectation was separate from Christ's righteousness being credited to you. The cross became an event in your past that took you from a negative on the number line of righteousness to zero, neutrality with God. Your advancement beyond zero was predicated on your ability to follow biblical (and sometimes extrabiblical) commands. It was rebuilding the Tower of Babel. Legalism stretched towards the heavens in a futile attempt to reach God, yet ultimately built without God. Despite the attempts, sin shockingly still existed. Grace was ruined and guilt reigned. Sin was routinely condemned, but just as routinely hidden.

True evangelism, sharing the gospel, was nonexistent. We may have been a city set on a hill, but the isolationist mentality made sure that hill was in the middle of nowhere. When there was interaction with others, evangelism amounted to, "Look at how perfect I am. Let me help you be this good."

How do you move forward?

We must first recognize that these man-made systems hold no promise. No political, economic, social, or educational system can guarantee the spiritual results sought. Any faith placed in a system is misplaced. The answer is not a system, but a Savior. A Savior who promises his grace will be sufficient, who promises to complete the work started in us, who promises to remain faithful when we are faithless, and who promises that nothing can separate us from his love. So we recognize who we are, who God is, what he has done, and what HE will do.

[1] Which stands for "Advanced Training Institute." If there was ever a more "'80s-sounding" set of initials and company name, I haven't found it. Derek is trying to find an ATI T-shirt for me so that I can wear it ironically.

[2] All further footnotes by Ted Kluck.

[3] Countenance is Gothard for "face." When Derek and I were researching this piece, he showed me some worksheets from an ATI manual wherein six pencil drawings of clothed women were presented, and you were supposed to pick out what was "trashy" or "defraudy" about each woman's outfit. Aside from all six of the outfits being hopelessly "'80s," all of the women's countenances/faces had been removed, and only a weird, empty oval remained atop their shoulders.

[4] This all sounds so eerily familiar.

[5] What the?

[6] See: things that sound familiar.

by Ted Kluck and Derek Lounds at April 14, 2014 05:01 AM

The American Conservative » Articles

A Turning Point on Crony Capitalism?

One of the most enduring images of the early Reagan years is the “welfare queen.” The not entirely apocryphal story of Linda Taylor, a Chicago woman making a six-figure income on welfare, became for conservatives a symbol of everything that was wrong with the welfare state—and for liberals a symbol of everything that was wrong with conservatives.

In the 1990s, that symbol was Claribel Ventura, a 26-year-old mother of six charged with child abuse. She scalded her 4-year-old son’s hands with boiling water to punish him for eating her boyfriend’s food. A Boston Globe investigation determined that Ventura was part of a family of 17 children, 74 grandchildren, and 15 great-grandchildren collecting a total of $1 million a year in public assistance.

The Globe asked one of Ventura’s siblings what she would say to the taxpayers footing the bill for their family. “Just tell them to keep paying,” she replied. Even in liberal Massachusetts, the public was outraged.

Twenty years later, is the new welfare queen General Motors, General Electric, Boeing, and other big corporations receiving taxpayer funds? Have Claribel Ventura and Linda Taylor been replaced by federally financed flops like Solyndra?

Mike Lee, the Republican senator from Utah, is a successor to the conservatives who railed against welfare abuse in the ‘80s and ‘90s. The Tea Party leader does have proposals to revamp anti-poverty programs. But his main target this year is the much more arcane Export-Import Bank, which ostensibly exists to benefit exporters.

“In short, Congress allows the Ex-Im Bank to unnecessarily risk taxpayer money to subsidize well-connected private companies,” Lee wrote in a National Review op-ed, quoting Barack Obama (now an Ex-Im Bank reauthorization proponent) calling it “a fund for corporate welfare” in 2008.

“Whether the beneficiaries of particular Ex-Im Bank loan guarantees are respected, successful companies like Boeing or crony basket cases like Solyndra is irrelevant,” Lee added. “Twisting policy to benefit any business at the expense of others is unfair and anti-growth.”

Republicans have been urged to oppose corporate welfare and cronyism since Reagan budget director David Stockman suggested attacking weak claims, not weak claimants. But as the party of business, GOP politicians have often been reluctant to put their free-market principles into practice when it means allowing the well-heeled or well-connected to fail.

That may be starting to change. House Financial Services Chairman Jeb Hensarling, a Republican from Texas, has jurisdiction over the Ex-Im Bank. He voted against its reauthorization in 2012 and is still pressing to make its survival contingent upon reforms. “I for one remain skeptical that taxpayers ought to be on the hook for this book,” he said in March.

Hensarling may have a powerful ally in Paul Ryan, the House Budget Committee chairman. The Hill reports that Ryan argued in favor of letting the Ex-Im Bank’s authorization lapse in a private meeting—and that House Majority Leader Eric Cantor may be reluctant to go to the mattresses on the issue.

According to The Hill, Cantor “has privately told members he does not intend to get involved this time around, a message that some see as an indication that he is wary of battling conservatives angered by a number of his recent legislative moves.”

Senate Minority Leader Mitch McConnell, who faces both a Tea Party primary challenger and a competitive Democratic opponent this year, voted against reauthorization in 2012. This could be a good chance for him to flex his conservative muscles without alienating swing voters.

Lee contends that the reauthorization battle comes at an opportune time. Authorization lapses on Sept. 30, right before the midterm elections. Republicans could oppose subsidies for big business at the same time the White House and most congressional Democrats are defending them.

The writers and thinkers who brand themselves “libertarian populists” hope this will flip the familiar script where opposing big government is seen as tantamount to supporting big business. They argue that business and government often have a codependent relationship.

It’s probable that fewer Americans could identify the Export-Import Bank than could find Ukraine on a map, so this one vote is unlikely to be transformative. But when companies that can afford to look out for themselves receive taxpayer benefits, it would be nice if Republicans were the party advocating for a level playing field.

W. James Antle III is editor of the Daily Caller News Foundation and author of Devouring Freedom: Can Big Government Ever Be Stopped?

by W. James Antle III at April 14, 2014 04:17 AM

assertTrue( )

Pseudogenes Are Not Junk DNA

In 2007, a PLoS ONE paper by Ahmed et al. proposed a phylogeny for Mycobacteria in which M. leprae (the leprosy organism) is shown as a relatively recent branch off a very long tree, with M. tuberculosis depicted (in a decidedly fanciful schematic) as being of relatively recent provenance (35,000 years), diverging from M. canettii (a recently discovered cousin of tuberculosis) 3 million years ago.

The rather fanciful phylogenetic picture of Mycobacterium evolution presented by Ahmed et al. (2007).

The only trouble with this picture is that we know it's wrong. More exacting work has shown that M. tuberculosis is at least 3 million years old, and one paper estimates that the common ancestor of TB and leprosy may go back 66 million years. If the latter figure sounds dubious, consider that until recently, M. leprae wasn't thought to have any sister strains that could aid with dating the organism phylogenetically. But in 2008, the situation changed dramatically when it was realized that in Mexico, a distinct form of leprosy known as "diffuse lepromatous leprosy" (DLL) was actually due to a genetically distinct variant of Mycobacterium known as M. lepromatosis. When the genome for the latter organism was analyzed, it was found to contain the same stupendous assortment of pseudogenes contained in M. leprae, but detailed analysis of polymorphisms in the genomes of the two strains led to a surprising finding: Divergence of the strains appears to have occurred around 10 million years ago.

Another team found that the massive "pseudogenization event" that caused M. leprae (and its cousin, M. lepromatosis) to become saddled with a record number (1,116) of pseudogenes probably occurred on the order of 20 million years ago.

The age and stability of the pseudogenes in M. leprae can only be described as stunning. Conventional evolutionary dogma says that pseudogenes will inevitably be degraded and lost over time. Surely M. leprae can't be conserving and repairing pseudogenes over 10-million-year-long timespans? Pseudogenes are discardable junk.

Or are they?

An analysis of Buchnera aphidicola (the tiny Enterobacterial endosymbiont of the pea aphid) put the half-life of pseudogenes in that organism at 23.9 million years.
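To put that figure in perspective, here is a quick back-of-the-envelope calculation (my own arithmetic, not from the Buchnera paper) of what a 23.9-million-year half-life implies for the M. leprae timescales discussed above:

```javascript
// Exponential decay: fraction of pseudogenes expected to survive
// after t million years, given a half-life of 23.9 million years.
const HALF_LIFE_MY = 23.9;
const survivingFraction = (t) => Math.pow(0.5, t / HALF_LIFE_MY);

// Over the ~10 million years since M. leprae and M. lepromatosis
// diverged, roughly three quarters of pseudogenes would remain.
console.log(survivingFraction(10).toFixed(2)); // ~0.75
```

On those numbers, the persistence of M. leprae's pseudogene complement is less a fluke than a pattern.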

Human DNA reportedly contains over 12,000 pseudogenes. Some of these pseudogenes are quite old. Parallel nonsense mutations caused the pseudogenization of the uricase gene in apes during the early Miocene epoch (about 17 million years ago). We still carry the pseudogene in question—and it gets transcribed. According to a report by James T. Kratzer and colleagues at the University of Texas at Austin:
Despite being nonfunctional, cDNA sequencing confirmed that uricase mRNA is present in human liver cells and that these transcripts have two premature stop codons.
The inevitable conclusion is that pseudogenes are not "junk DNA," and should not be considered junk by default. To the contrary, the default assumption should be that pseudogenes are ancient and conserved—because in most cases, that's exactly what they are.

What causes genes to "go pseudo"? Why are they conserved? What are they really doing? I'll tackle some of those questions in a followup post. Stay tuned.

by Kas Thomas ( at April 14, 2014 04:00 AM

Justin Taylor

Holy Week, Day 2: Monday

Monday, March 30, AD 33.

The following video, filmed in conjunction with our book The Final Days of Jesus, features short explanations from and interviews with New Testament professors Nicholas Perrin (of Wheaton College) and Grant Osborne (of Trinity Evangelical Divinity School), focusing in particular on the cursing of the fig tree, the cleansing of the temple, and the role of the temple in the theology and practice of Jesus. We will be releasing a new video each day this week.

by Justin Taylor at April 14, 2014 04:00 AM

Vivek Haldar

You can't run with the machines

I recently read “The Second Machine Age” by Erik Brynjolfsson and Andrew McAfee. The major points the book makes—that information technology and digitization are advancing exponentially, that automation is rapidly expanding into even cognitive tasks that until recently were thought doable only by humans, that the digital worldwide economy rewards winners and leaves even close seconds in the dust—are all quite well-known by now, but the authors have tied them together with a narrative that makes the book a good single place to read about them.

I particularly looked forward to the chapter titled “Racing With The Machines: Recommendations for Individuals”, in which they suggest what kind of work will be left for humans to do, and what skills we should try to develop to stay ahead of automation. I really wanted to know if they had cracked the nut of how we would stay gainfully employed. I was disappointed.

The authors looked at prior attempts by Levy and Murnane to predict limits on what activities would still require humans. Levy and Murnane held that computers are good at following rules and bad at pattern recognition, giving the example of driving a car as too cognitively and sensorially complex to be automated. As the authors say in the book: “so much for that distinction.” Then they go right ahead and make their own prediction of the next frontier that computers will be unable to cross:

Our recommendations about how people can remain valuable knowledge workers in the new machine age are straightforward: work to improve the skills of ideation, large-frame pattern recognition, and complex communication instead of just the three Rs. And whenever possible, take advantage of self-organizing learning environments, which have a track record of developing these skills in people.

The biggest problem with this advice is that it is skating to where the puck is, not where it is going to be.

The example they lean on the most is freestyle chess where teams of humans and chess programs compete against each other (hence the title of the chapter). It turns out that the combination of a chess program guided by a human is more powerful than either alone. Freestyle chess is also a central example in Tyler Cowen’s latest book. Unlike Brynjolfsson and McAfee, Cowen wonders if this is just a transitional phase, and if humans will ultimately not add any value in this pairing.

Their recommendation about “ideation” and “large-frame pattern recognition” is not concrete enough. What does that mean specifically for someone choosing a college major today? And more importantly, can we be sure that those activities will remain out of the reach of computers by the time they graduate?

The debate about whether human thought is computable is an open one, but the vast majority of human cognitive activity does not happen anywhere near that threshold. In an ironic similarity to the diagonal cut, perhaps the only people secure in their isolation from automation are mathematicians probing the bounds of what is computable, and how.

But each one of the rest of us has to wonder if, within our lifetimes, our jobs are automatable. I program for a living, and while a good fraction of what I do is intellectually challenging, there is also some that makes me feel like just an operator.

Many non-technical authors think of Kasparov losing to Deep Blue as a major milestone in AI, but that was largely due to arriving at a point in time when Moore’s Law had delivered enough computing beef to “solve” an exponentially complex game like chess. A more meaningful threshold would be when insight can be computed. For example, could a computer propose an algorithm with the simple elegance of Quicksort?

“Running with the machines” is a temporary strategy at best. That is simply the halfway house between humans doing something and it being fully automated. A more accurate phrase would be “run by the machines”, because cheap human labor is crowdsourced for the kinds of problems that are just barely outside a computer’s (current) reach.

I see two strategies for staying ahead of automation:

The first is to be the one doing the automation. In other words, to be a programmer. (Massive disclaimer: of course I would say that, being a programmer.) More bluntly, be the hunter instead of the hunted. The problem is that not everybody is able or willing to do that.

The second strategy is to be a doctor of machines. Large computing systems show organic behavior, and tending to them often requires the same mindset and behaviors as a doctor diagnosing and tending to patients. I like to draw an analogy to cities. A (relatively) small number of people initially construct the infrastructure (pipes, wires, roads), but a much larger number of people maintain them continuously.

We will have interesting lives.

April 14, 2014 03:35 AM


don't code today what you can't debug tomorrow

Supersonic JavaScript

A few days ago, I gave a talk at the most recent Web Tech Talk meetup hosted by Samsung. The title is Supersonic JavaScript (forgive my little marketing stunt there) and the topic is changing the way we think about optimizing JavaScript code.

None of the tricks presented there will make your code break the sound barrier. Nevertheless, some of them can serve as food for thought, provoking our brains to look at the problem in a few different ways. If you want to follow along, check or download the slide deck (before you ask: it was not video recorded).

I discussed four different ideas during the talk.

Short Function. Back in the old days, function calls were expensive. These days, modern JavaScript engines are smart enough and can do self-optimization. For some details on this optimization, read my previous blog posts Automatic Inlining in JavaScript Engines and Lazy Parsing in JavaScript Engines. There is no need to outsmart the engine; stick with concise and readable code.
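A hypothetical illustration (the names are mine, not from the talk): a tiny helper costs nothing to keep separate, because the engine will inline it at hot call sites anyway.

```javascript
// A small, readable helper. A modern engine will typically inline
// this at hot call sites, so manual inlining buys nothing.
function square(x) {
  return x * x;
}

function sumOfSquares(values) {
  let total = 0;
  for (let i = 0; i < values.length; i++) {
    total += square(values[i]); // reads clearly; inlined by the engine
  }
  return total;
}

console.log(sumOfSquares([1, 2, 3])); // 14
```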

Fixed Object Shape. This swings in the other direction. How can we help the engine so that it can take the fast path most of the time? For more information, refer to my blog post JavaScript object structure: speed matters.
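A minimal sketch of the idea (the Point type is my own example, not from the referenced post): construct objects with the same properties in the same order, so the engine can reuse one hidden class and keep property access on the fast path.

```javascript
// Good: every Point is created with the same fields in the same
// order, so all instances share a single internal shape.
function Point(x, y) {
  this.x = x;
  this.y = y;
}

function distance(a, b) {
  const dx = a.x - b.x;
  const dy = a.y - b.y;
  return Math.sqrt(dx * dx + dy * dy);
}

// Risky: adding properties after construction changes the object's
// shape ({x} -> {x, y}) and can force callers onto the slow path.
const p = { x: 0 };
p.y = 4;

console.log(distance(new Point(0, 0), new Point(3, 4))); // 5
```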

Profile Guided. Related to the previous point: can we tailor our own code so that it takes the fast path whenever possible but still falls back to the slow path every now and then? What we need is a set of representative data for the benchmark; the resulting profile can be used to tweak the implementation. More details are available in my two other blog posts Profile Guided JavaScript Optimization and Determining Objects in a Set.
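As a hypothetical sketch of the fast-path/slow-path split (the function and the "profile" are my own invention): if profiling representative data shows that most inputs are plain arrays of numbers, that case gets a tight loop, with a generic fallback for everything else.

```javascript
function sum(values) {
  // Fast path: an all-number array, the common case according to
  // our (hypothetical) profile. A real implementation might use a
  // cheaper guard than every(), informed by the same profile.
  if (Array.isArray(values) && values.every((v) => typeof v === "number")) {
    let total = 0;
    for (let i = 0; i < values.length; i++) {
      total += values[i];
    }
    return total;
  }
  // Slow path: any iterable, coercing each element to a number.
  let total = 0;
  for (const v of values) {
    total += Number(v);
  }
  return total;
}

console.log(sum([1, 2, 3]));         // 6 (fast path)
console.log(sum(new Set(["4", 5]))); // 9 (slow path)
```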

Garbage Handling. Producing a lot of objects often places a burden on the garbage collector. As an illustration, check out a short video from Jake Archibald describing the situation of using +new Date.
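A small sketch of that situation (the function names are mine): +new Date allocates a Date object on every call just to coerce it to a number, whereas Date.now() returns a primitive directly, giving the garbage collector nothing to clean up in a hot loop.

```javascript
// Allocates a throwaway Date object per call; in a hot loop this
// churn adds pressure on the garbage collector.
function timestampWithGarbage() {
  return +new Date();
}

// Returns a primitive number directly; nothing to collect.
function timestampNoGarbage() {
  return Date.now();
}

// Both produce the same kind of value: milliseconds since the epoch.
const a = timestampWithGarbage();
const b = timestampNoGarbage();
console.log(typeof a, typeof b); // number number
```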

There is no silver bullet to any performance problem. However, as I already mentioned in my Performance Calendar article JavaScript Performance Analysis: Keeping the Big Picture, it is important to keep in mind: are we always seeing the big picture, or are we trapped into optimizing for a local extreme only?

Now, where’s that TOPGUN application form again…

by Ariya Hidayat at April 14, 2014 02:48 AM

Natural Running Center

Run Fearless in Boston

April 21 will be my twenty-first running of the Boston Marathon. That’s a pleasing symmetry, but I also know that I will be experiencing unpredictable emotions. At last year’s marathon, I had already finished the race and was with my family, safe in a hotel a block away, when the terrorist attack occurred. The bombings [Read more...]

by BillK at April 14, 2014 02:24 AM

The Aporetic

Blogging and the return of the repressed

Earlier today I was at an OAH roundtable on blogging as scholarship. There were a number of distinguished bloggers there (Anne Little;  Kenneth Owen; Benjamin Alpers; John Fea), all of whom blog in very different styles. The audience was great.

I thought I’d post some of my argument about why blogging is valuable as scholarship: it boils down to the fact that our notion of scholarship and “historical work” is deranged. Not just in the fact that we have, preposterously, only three real forms, the conference paper, the article, and the book: it’s also that the style of academic discourse is grotesquely psychologically conflicted.

I’ve argued before that the way we’re taught to read bears no relation to the way we are taught to write. We are taught to write as if our audience were a learned man of leisure, with servants, and we’re taught to read like sous-chefs gutting a fish: quickly, ruthlessly, under time pressure. We are asked to construct a form of self-punishment: we write for the person we wish we could be, and in reading destroy that person we imagined.

Notice there’s no one there

Don’t take my word for it. You can clearly see this conflicted self in the contrast, growing wider every year, between the text of any academic history and the acknowledgments. Open any book of footnoted academic history published in the last two decades, and the text will almost never use the word “I,” almost never mention anything personal, never describe intellectual struggle or uncertainty. The text will aim to erase the author altogether, so the argument emerges full grown like Athena from the head of Zeus. But the acknowledgments! The acknowledgments are a virtual carnival of the self, full of confessions of doubt, descriptions of struggle, metaphors of journey and passage and transformation; yearnings and regrets and intimacies; salutes to comrades professional and personal, the fallen and the still standing. The acknowledgments are colorful, personal, and self-indulgent; the text is personless and self-banishing.

Something’s not right here! I mean, mentally not right. The division between the text and the acknowledgments is as wide as or wider than the division between the way we are taught to write and the way we are taught to read. It is a sign of repressed desires and wishes. Really, from a distance it looks like a mental illness.

Blogging maybe has the potential to reintegrate the fragmented academic personality. It makes the personal visible. It allows for struggle; it is the journey towards meaning. It allows for an authorial voice that speaks through itself, instead of through some disembodied imagined person. It’s embedded in community. And it doesn’t involve the violent forms of self-erasure that the acknowledgments keep proving we want to escape.

by mike at April 14, 2014 02:23 AM

CrossFit Naptown

3 must do’s for Olympic Lifting

Today’s workout:

Back Squat
5×3 80%
As close to EMOM as possible or 8 minutes cap

WOD (15min cap)
4 rounds
3 burpee pull ups
5 Hand Stand Push Ups (sub BOX HSPU or 10 hand release push ups)
7 Power Cleans (155/115)


Great Article that Coach Kevin Found:

Three Habits of Great Lifters
Written by Bryan Miller

Here are three habits (that are a must!) for those looking to increase general strength, compete in a strength sport, or overall just be a great lifter!

1. Movement Quality vs. Chasing PRs

Patience is one of the most noticeable characteristics of an experienced lifter. All too often beginners will chase PR after PR until their progress comes to a screeching halt.

Why does this happen? I like to call it the ‘novice effect’. Each person has a certain strength/lift potential that can be reached even with poor movement quality. The novice lifter will progress rapidly at first just by getting stronger, but the progress is short lived and WILL plateau. When that strength potential is finally reached, the only way to become a better lifter is to do what should have been done first – and that is to focus on movement quality. To solve this problem, technique should be of the utmost importance until the lifter is proficient enough to begin breaking personal records with perfect technique. The movement should look the same, whether it is a PR or one hundred pounds below a PR.

2. Training vs. Expressing Strength

When the lifter has a good level of technique, then the real training can begin! However, technique should continue to be refined and never forgotten; nobody is perfect.

Expressing strength should be reserved for competitions or certain days in training. How often do you see Kendrick Farris (or any other elite lifter) max out in training? Not very often. On the contrary, there are many videos of him (and others) doing multiple reps of squats or high rep snatches in sets of 3s and 5s.

One of the best powerlifters of all time, Ed Coan, would hit training maxes every 12 weeks. Leading up to the training max, he would squat for 10s, 8s, and 5s. Quit chasing PRs and start getting stronger.

3. Work on Your Weaknesses

During one of Dimitry Klokov’s seminars, he was asked, “Why do you press and deadlift so much?” His response was that those are his two weakest body parts, and when he focused on those weaknesses, his Olympic lifts improved. You will notice he didn’t answer snatch or clean and jerk all day, every day. He has found what works for him through varying his training. The point is that you need to find what YOUR weaknesses are and attack them with vigor.

by Peter at April 14, 2014 02:17 AM

One Big Fluke

My talk from PyCon 2014

Code samples are here. Slides are embedded below (use slide forward/back buttons for best effect). Or download a PDF of the slides.

Video hopefully will be uploaded after the conference and I'll repost that too. Update: Here's the video!

by Brett Slatkin ( at April 14, 2014 02:16 AM

512 Pixels

April 13, 2014


Willa Cather, My Ántonia


Opening Passage: The story has an introduction in which an unnamed female narrator, representing Cather, introduces Jim Burden, the primary narrator of the story, but Burden's manuscript is the major beginning of the work.

I first heard of Ántonia on what seemed to me an interminable journey across the great midland plain of North America. I was ten years old then; I had lost both my father and mother within a year, and my Virginia relatives were sending me out to my grandparents, who lived in Nebraska. I travelled in the care of a mountain boy, Jake Marpole, one of the 'hands' on my father's old farm under the Blue Ridge, who was now going West to work for my grandfather. Jake's experience of the world was not much wider than mine. He had never been in a railway train until the morning when we set out together to try our fortunes in a new world.

Summary: The epigraph for My Ántonia is from Virgil's Georgics, a line that will be found as well in the main body of the text: Optima dies...prima fugit, "The best days flee first." The combination of nostalgia and reflection on the life in the country represented by this quotation from Virgil's pastoral summarizes the book very well.

The narrator, Jim Burden, arrives on the prairie as a young boy, and there in the frontier lands he discovers a wild mix of immigrants: Swedes, Norwegians, Bohemians, and the like. Among the Bohemians he meets a young girl, slightly older than he, Ántonia Shimerda, and begins a lifelong friendship. Much of the story is about Burden remembering Ántonia as she grows from an immigrant country girl on the farm to a "hired girl" in the city acting as governess and housekeeper while she and other girls her age become a little crazy for dances. Then Burden goes off to university, and Ántonia falls in love with a man, Larry Donovan, who abandons her with a baby before they even marry. And it ends with Burden reuniting with Ántonia, who is now married and on a farm with ten or eleven children. It's a simple enough story. The major dramatic point, Ántonia's being abandoned, is entirely offstage, because this is not a dramatic story but a nostalgic one. When we think back on the stories of our lives, we do not think in terms of climactic plots and denouements, nor rising and falling action; we remember little doings and happenings that flow into bigger doings and happenings as they fade away in memory and evidence. Hopes and dreams burgeon, constantly changing, some leading to good things and some dissipating like clouds, and the point is not some specific struggle or profound crisis but instead layers and layers of stories interwoven with each other.

At one point Burden says of Ántonia, "Ántonia had always been one to leave images in the mind that did not fade—that grew stronger with time.... She lent herself to immemorial human attitudes which we recognize by instinct as universal and true." And, as he continues, what this means is that she -- and by extension her story -- "somehow revealed the meaning in common things." Human lives are like that; we live well when the goodness of common things is brought out by our lives, as if our lives consisted of planting and tending and harvesting meaning in the simple things of the world. Ántonia's story is not some wild, romantic adventure; it is the kind of story that people live every day. But that's the point, of course. It's fitting that as the story nears its end Burden and Ántonia meet again to tell stories over old pictures and memories. Those kinds of stories are not exciting creative adventures; they are usually not 'original' in any rigorous sense of the word, just being tales of ordinary things. But they are the most fundamental stories of human life, the true heart of human story, not artificial entertainments, but the way we naturally tell the tales of our own lives.

Favorite Passage:

Before I could sit down in the chair she offered me, the miracle happened; one of those quiet moments that clutch the heart, and take more courage than the noisy, excited passages in life. Antonia came in and stood before me; a stalwart, brown woman, flat-chested, her curly brown hair a little grizzled. It was a shock, of course. It always is, to meet people after long years, especially if they have lived as much and as hard as this woman had. We stood looking at each other. The eyes that peered anxiously at me were—simply Antonia's eyes. I had seen no others like them since I looked into them last, though I had looked at so many thousands of human faces. As I confronted her, the changes grew less apparent to me, her identity stronger. She was there, in the full vigour of her personality, battered but not diminished, looking at me, speaking to me in the husky, breathy voice I remembered so well.

Recommendation: Very good and highly recommended.

by Brandon ( at April 13, 2014 10:58 PM

Lift Big Eat Big

Training The Farmer's Walk

Article written by Matt Mills

The farmers walk is one staple of my programming that I never take out. If you want to get bigger, leaner, and more athletic, then the farmers walk is the answer. If you are a strongman competitor, the farmers walk is absolutely essential to your training. If you are not a competitor of any kind, the farmers walk is about as functional as it gets, and should be performed by anyone. No, I’m not talking about squatting on a bosu ball because that’s considered “functional” for some reason beyond me. I’m talking about an exercise that we all do every single day. Any time you carry a weight you are performing a farmers walk to some degree; hence the name. One of the most rewarding things as a strength coach is when I have a member of my gym come to me all excited telling me how they were able to carry all of their groceries in at the same time. If you ever need help moving, be sure to ask the guy or girl that has a good farmers walk!

If you are a powerlifter, or just someone looking to get their deadlift up (and who isn’t?!), you want to carry some heavy weight in your hands. One of the biggest benefits of doing farmers is the increased grip strength. Whenever I see someone miss a deadlift because of their grip I can’t help but cringe. Having a strong grip is essential to living a healthy life, and ladies, I know you don’t like asking your guy to open that jar up. For those of you interested in more fat loss, the farmers walk is a perfect finisher. Literally every muscle in your body must work to either stabilize, or move the weight efficiently. The more muscle groups you work, the greater the metabolic effect in burning calories. One of my favorite benefits of the farmers walk is the amount of work your traps get. Whenever you see someone with big traps and neck, you know they have put some work into the gym. In fact the farmers walk is my favorite builder for the traps, and is the first exercise I suggest when someone asks for advice.

The core is taxed heavily here, and this is one of the best ways to get strong abdominals without doing any direct work. For you competitors that have a weakness on the farmers walk that is not your grip, it is your core. A great way to fix this is with suitcase carries. Simply take one farmers handle, load it up to a fairly light weight of about 50%-60% of your max, and carry it for a given distance. I generally stick with 50-100 feet. Make sure you stay as tall as possible, and do not slouch to the side you are carrying the load. You will feel the obliques of your opposite side screaming by the end. Suitcase carries are also another great way to increase your deadlift, as strong obliques are essential to a big pull. Another great way to build your core strength is to carry uneven loads. Load one weight up to 75% of your max and the other to 50%. The challenge to stay upright will be extremely difficult. Improve on your core strength with these variations and watch your farmers go through the roof.

I would say for most competitors, grip is the biggest weak point on the farmers walk, and it’s what the event really tests in a contest. For those of you who are against straps, this is the reason why they are allowed in strongman on the deadlift most of the time. Strongmen arguably have the best grip in the world, and it is tested heavily on events like this. Put farmers walks in a contest with a husafel, keg, sandbag carry, and stones, and your grip will be fried by the end. If you are looking to hit farmers as hard as possible, look into getting a pair of lifting straps and use them on your deads and heavy rows. I do have to make a statement about one thing in your training that I get asked about quite a bit, and have even read in another article: NEVER USE STRAPS ON A FARMERS WALK…EVER!!!

When grip is your main weak point, you have a couple ways to make it your strength. First, use a pair of fat grips to make the farmers handles much thicker. You will have to drop the weight down quite a bit, but once your grip increases from the thicker handles, the normal farmers will feel like toothpicks. Another option that I love to do on my final set of heavy farmers is to hold them once you finish your carry. Squeeze the handles as hard as possible and for as long as you can. Just make sure you save this for your last set, because otherwise your grip will be shot for your other sets.

In a strongman competition, you will most likely run into a farmers event that has a turn involved. Turning with farmers handles is incredibly hard, and will tax your grip immensely. There are a couple tricks to help you master the turn. A common mistake I see is competitors trying to make a complete stop at the turning point, staying in place while slowly turning around, and then trying to pick up speed again. Stopping is only going to slow you down, and make you have to hold onto the handles even longer than you should have to. Instead of making this mistake, take a wider turn and keep your feet moving, so you don’t lose any speed.

You will have to slow down slightly to keep the farmers under control, but you will make the turn much faster, and be able to pick up momentum once you make the full 180 degrees. The most important part here is that when you start to come around from the turn, you must not let the farmers handles continue to turn. Turning with the handles at a heavy load is extremely difficult to control, and without controlling them they will continue to turn you until you lose your grip. Right before you feel the handles start to turn you, push back against them hard in the opposite direction. When you turn against the handles it will actually keep them straight in line, and allow you to continue to the finish line in a straight shot. Finally, while turning, do your best to keep the plates in contact with one another. Once the plates stagger, the turn will be even more difficult as the load will be unbalanced.

Here are a couple quick tips that make a huge difference on the farmers walk:
Grip the handles not in the middle but just a hair back from the center.  Your grip mainly comes from your index, middle finger, and thumb.  I grip the farmers in the middle then move them back just a half inch.  Once you start moving with the weight the handles will dip slightly making you move the weight faster from the momentum.

Dig your hand into the handle for your grip.  You should curl your wrist in as deep as possible.  Once you pick it up your wrist will straighten.  This will pinch your hands more but your grip will be better, which is more important than your sensitive hands.

Use a staggered stance when the weight is light enough.  This is a little trick that will save you a second or two on your time.  If you start in a staggered stance you can take a step right away as you pick the weight up.  I line the toes of my back foot up with the heel of my front foot. 
Take short choppy steps, do not try to take long strides as this will make the handles swing more, making them much harder to control.  

Programming Options
I like to make my training more difficult than it will be for an upcoming contest. If this is possible for you, always train an event slightly heavier than what you will be doing in a contest. For example, for the Arnold Classic I recently competed in, I had to carry 345lbs in each hand for 75 feet. Leading up to the contest I trained only for 100 feet, starting at a lighter load of 280lbs. By the final heavy week of training my heaviest carry was 365lbs in each hand for 100 feet. If you are not able to go slightly heavier, then work at the heaviest load you are able to for the given distance of the contest. Once you have reached a heavy max, drop the weight down to about 65% and perform multiple sets of speed runs with short rest periods of 60-90 seconds. As always, find what your weak point is on the farmers, as I have outlined above, and make it your strength!

by Brandon Morrison ( at April 13, 2014 10:05 PM

The Urbanophile

On the Riverfront

Thursday I took a look at my “Cincinnati conundrum,” namely how it’s possible for a city that has the greatest collection of civic assets of any city its size in America to underperform demographically and economically. In that piece I called out the sprawl angle. But today I want to take a different look at it by panning back the lens to see Cincinnati as simply one example of the river city.

There are four major cities laid out on an east-west corridor along the Ohio River: Pittsburgh, Cincinnati, Louisville, and St. Louis (which is not on the Ohio River, but close enough. I’ll leave Memphis and New Orleans out of it for now). All of these are richly endowed with civic assets like Cincinnati is, having far more than their fair share of great things, yet they’ve all been stagnant to slow growing for decades.

This suggests a broader challenge: if urbanity and quality of life are so determinant of economic success, why aren’t these places juggernauts? It’s not that they are failures by any means, but they are long term under-performers.

Over the Rhine, Cincinnati – one example of the spectacular urban assets of these cities

I don’t pretend to have all the answers, but since these cities share many characteristics, I wanted to show what they have in common. Doubtless some of these common threads play a role.

These cities came of age earlier than railroad based cities like Chicago. These are some of the earliest major cities in the region, and they owe their prominence to the era when the river was the major form of transport. They’ve all had a heavy German Catholic influence, hence the legacy of breweries and the importance of private Catholic high schools in these areas even today. They have bridge-oriented transportation traffic patterns and bottlenecks. They’ve got interesting geography with hills and trees and some similar climate patterns.

I find it particularly interesting that they have similar political geographies, despite being in four different states. Three of them are multi-state metros, obviously, because the rivers are state borders. But beyond that they all have hyper-fragmented systems of lots of tiny cities and villages that are fiercely independent. Here’s a map of all the municipalities in St. Louis County, for example:

Image via ArchCityHomes

All of these cities ceased annexing early and got hemmed in. St. Louis famously detached itself from the county completely to become an independent city. Only Louisville, with its recent city-county merger, grew out of this. But Louisville’s Jefferson County still features numerous sixth-class cities and such that were excluded from merger, some of which are only a couple blocks in size. Hamilton County, Ohio and Allegheny County, Pennsylvania are similar.

Inside the cities themselves, there are also many well defined, distinct neighborhoods. These are usually small in size compared to what are called neighborhoods in cities like Chicago. Also, there can be deep divisions between the different sides of town. These are very divided cities. Cincinnati has the East Side-West Side divide. Louisville has the East End, the South End, and the West End. And which one you are from is a huge cultural marker. The North and South Sides of Indianapolis are very different and have some sniping back and forth, yet I don’t see the same visceral suspicion across the sides of town compared to say how Louisville’s South End (mostly working class white) sees the East End (the favored quarter). That helps explain why it took Louisville 40 years to build new Ohio River bridges, and why Cincinnati had to overcome unbelievable obstacles to build a streetcar.

These cities are also provincial and insular in their character. As a transplant to Louisville put it, “Louisville is parochial in all the best and worst ways.” These are cities with rich, unique architectural traditions, and with tremendously distinct local cultures compared to other cities in their region such as Indianapolis or Columbus, which have been largely Genericaized. So Cincinnati has its chili. St. Louis has its pizza. Pittsburgh even has its own yinzer dialect. In at least three of the four of these cities – I don’t know about Pittsburgh – the first question you get asked is “Where did you go to high school?” which tells you almost everything you need to know about them.

While provincialism is almost inherently negative as a term, this has big upsides for these cities too. They have an incredible sense of place and uniqueness. The brick houses of St. Louis are unlike anything else, for example. Again, the feel of these places is very notable in contrast to neighbors like Columbus and Indy, which give off a Sprawlville, USA vibe.

Trailer for the film Brick: By Chance and Fortune.

This provincialism comes with two associated character traits. One is a degree of solipsism. Solipsism is the philosophical proposition that nothing can be known to exist outside the self. It’s different from egotism. Egotism says you’re better than everybody. Solipsism says there isn’t anybody else. Obviously we’re talking degrees here, not absolutes. But this is key I think to the retention of those local traditions and local character.

I’ll give an example that illustrates this. Cincinnati arts consultant Margy Waller made a comment to me a few years ago that really stuck with me. She said that when people leave Cincinnati and come back, the stuff they did and learned while they were away might as well not have happened. She left and worked for several years in Washington, including in the Clinton White House. I’m not sure exactly what she did there, but if you’re working in the White House, by definition you’re operating at a bigtime level. But that’s barely mentioned in Cincinnati. Few people ever ask how her DC network or experience can inform or support the city.

Similarly Randy Simes is an instructive case. A graduate of the University of Cincinnati planning school, he got a job with a tier one engineering firm in Atlanta. But he also started and ran the blog Urban Cincy, which is a relentlessly positive advocate for the city and maybe its most effective marketing voice to the global urbanist world (the Guardian listed it as among the best urban web sites on the planet). Eager to come back to Cincinnati, he looked for a job there. But he couldn’t find one. Here’s a guy with 1) legitimate professional credentials 2) a top tier firm pedigree 3) the city’s most effective urban advocate 4) non-controversial, positive, and aligned with the political structure of the city and 5) he’s 24-25 years old and so it’s easy to hire him – you don’t need an executive director position or something. Yet no interest. Shortly thereafter he was head hunted by America’s biggest engineering firm to move to Chicago and then was sent on an expat assignment to Korea where he’ll be working on, among other things, one of the world’s most prominent urban developments (one that Cincinnati actually flew people in from Korea to present to them about). Jim Russell had a very similar experience with Pittsburgh.

The relationship of prophets and home towns has been known for some time, so I don’t want to pretend this is a totally unique case. But I can’t help but compare Randy’s case to blogger/advocate Richey Piiparinen in Cleveland, for whom an entire research center was created at Cleveland State (admittedly, he was already local at the time). I just don’t think Randy’s accomplishments outside Cincinnati resonated.

And secondly, these places do sometimes cross over into a sort of hauteur. I think because these were all very large, important cities in their earlier days and because they had so much amazing stuff, it bred a sort of aristocratic mindset perhaps. Having lived in both Louisville and Indianapolis, I clearly see the difference. In Indianapolis cool people will happily tell you how awesome they think St. Louis, Cincinnati or Louisville are. They’ll make visits to say the 21C Hotel or Forecastle Festival in Louisville and write and say great things about it and even how they wish Indy had some of those things.

But people from Louisville would rather bite their tongues out than say nice things about Indianapolis. If forced to, they will, but they do it in the most grudging way. I’ll never forget a travel guide for Louisville called the “Insiders Guide to Louisville” (I believe different than the one currently being sold under that name). In the intro they were bragging about Louisville’s totally legitimate food scene, but they had to throw in a gratuitous insult by saying something along the lines of, “Every city has good restaurants these days – even Indianapolis, we hear – but Louisville’s restaurants are truly special.” When Indianapolis Monthly did its “Chain City, USA” cover on Indy’s restaurants, I had to send it to my friends in Louisville since I knew they’d eat it up gleefully. (If you watched the St. Louis brick film trailer, you’ll also notice someone in it throwing a similar gratuitous dart at the Illinois brick used in Chicago).

Hot off the presses is this travel piece on Indianapolis written by someone in Louisville. As a travel piece, it is going to be positive by the very nature of the genre, but note the way the writer frames up the trip:

I bristle whenever I hear about flyover country – my home of Louisville is smack in the heart of what east and west coasters think is just the space they have to cross to get from one good part of the country to another – so I should be a little more open minded. But maybe because of my fondness for my hometown, it turns out I’ve been harboring a bit of the same snobbery that those fliers do – toward a northern neighbor.

My friend Kristian was bragging to me about Indy’s tech scene one day. I’d just gotten back from Cincinnati where I’d gotten to see their tech scene showcased, tour the Brandery accelerator, etc. So I said, “What about Cincinnati? Looks like they are rocking and rolling.” Kristian was like, “Oh yeah, they’re awesome. I was just down there and they totally get it, there’s some great stuff going on.” Then he made a comment that I think summed it up: “You know what though? They’re in love with their own story.”

That sums it up. These cities are in love with their own stories. That perhaps also explains a bit of it. With so many amazing assets it’s easy to be complacent. It reminds me of the famous quote from the triumphant (and boosterish) Chicago Democrat as Chicago started to pull away from St. Louis as the commercial capital of the Midwest: “St. Louis businessmen wore their pantaloons out sitting and waiting for trade to come to them while Chicago’s wore their shoes out running after it.”

If you’re too in love with your own story, you’re not going to work as hard as you should to take that story to the next level. After all, the story of these cities isn’t finished yet. But there’s a new generation in these places that isn’t wedded to the old ways. They love the story, but have some chapters of their own they want to write. As urban assets like theirs come back into fashion in the market, it will be interesting to see how these cities evolve. As the press for Pittsburgh shows, for example, there are already plenty of signs of an inflection point. And in a region where places tend to flagellate themselves, having some cities with a bit of honest-to-goodness civic hauteur can actually be a refreshing change.

The Urban State of Mind: Meditations on the City is the first Urbanophile e-book, featuring provocative essays on the key issues facing our cities, including innovation, talent attraction and brain drain, global soft power, sustainability, economic development, and localism. Included are 28 carefully curated essays out of nearly 1,200 posts in the first seven years of the Urbanophile, plus 9 original pieces. It's great for anyone who cares about our cities.

by Aaron M. Renn at April 13, 2014 09:30 PM

The Outlaw Way


Note: Percentages after lifts note drop sets based off maxes. There is no prescribed rest for any work.


1) Clean: Work to a 3rm (non-T&G, reset quickly after drop) – 1X3@95%, 1X3@90%

2) Jerk (off blocks): Work to a 3rm (drop all reps) – 1X3@95%, 1X3@90%


1) Jerk Drives (with 3 count pause at bottom of dip): 3X5 @ 110-120% of max Jerk

2) Push Press: Work to a 5rm – 1X5@95% of 5rm, 1X5@90% of 5rm


1A) Back Squat: Work to a 5rm – 1X5@95% of 5rm, 1X5@90% of 5rm
1B) Bench Press: Work to a 5rm – 1X5@95%, 1X5@90%

The post 140414 appeared first on The Outlaw Way.

by at April 13, 2014 09:18 PM


Ok, I thought we’d have Power and Connectivity up by today, but the internets has failed us. You may see a post pop up later for both of them – if, of course, we can get them fixed.

Also, The Outlaw Way and Outlaw Barbell will be sharing some common elements for the next 15 weeks. However, as I will explain later, do not try to use the Outlaw Barbell template as additional work to “help your lifts”. It is a Weightlifting program, meant to make you good at Weightlifting. Stay on The Outlaw Way if you are not fully committed to competing in WL. Now to the cool stuff…

Outlaw Barbell is back, and we’ve now got a proven template to share with all of you. Back at the end of December, Jared and I discussed what direction to take the program. We thought about bringing in another national coach to program, but decided against it because we knew that no one was really as familiar with the needs of competitive exercisers switching to Weightlifting as we would like. So we decided to come up with our own template based off of a combination of what Jared does, and what I thought was necessary for success with the transition. We decided to let the Outlaw Barbell site go dark until we had enough time and data to make sure our template was just right.

Today, 16 weeks after starting the template, and on the day we targeted as the re-launch date, 58kg Caitlin Vodopia hit a 14kg meet PR total of 161kg. Caitlin was one of our original test subjects, and an important one. She was a nationally competitive lifter before she started the template, and had crossed over to Weightlifting from competitive exercise. We couldn’t be any happier with the results we’ve seen so far, and can’t wait to see how you guys fare.

Let me make this clear: this is not a supplemental template to help with your lifts, this is for those of you looking to make the switch to full-time Weightlifting. It is also not a casual template, it’s hard as hell. If you have questions please post them on the Barbell blog. Also, we’ll be posting quite a few technique videos for the “no-name” style of lifting that we’ve been working on with the Outlaw Barbell crew.

WOD 140414:


1) Clean: Work to a 3rm (non-T&G, reset quickly after drop) – 1X3@95%, 1X3@90%, rest as needed

2) Jerk (off blocks): Work to a 3rm (drop all reps) – 1X3@95%, 1X3@90%, rest as needed


1A) 4XME Strict Pullups – rest 60 sec.
1B) 4X5 Push Press – heaviest possible, rest 60 sec.



For time:

Row 1K
50 Thrusters 45#
30 Pullups

The post 140414 appeared first on The Outlaw Way.

by at April 13, 2014 09:16 PM

Kevin DeYoung

Preaching Carefully on Palm Sunday

Just to be clear: the crowd on Palm Sunday welcoming Jesus with shouts of “Hosanna!” is by and large not the same crowd on Good Friday that demands his death with shouts of “Crucify!”

This is a popular point preachers like to make, and I’ve probably made it myself: “Look at the fickle crowd. They sing songs to him on Sunday and five days later on Friday they want to kill him. How quickly we all turn away.” But read all four gospel accounts carefully (and check some good commentaries). The excited throng on Palm Sunday was filled with Galilean pilgrims and the larger group of disciples, not the Jerusalem crowd in general (see Luke 19:37; Mark 15:40-41; John 12:12, 17).

R.T. France summarizes:

There is no warrant here for the preacher’s favourite comment on the fickleness of a crowd which could shout ‘Hosanna’ one day and ‘Crucify him’ a few days later. They are not the same crowd. The Galilean pilgrims shouted ‘Hosanna’ as they approached the city, the Jerusalem crowd shouted, ‘Crucify him.’

Have a blessed Holy Week that sticks closely to all sorts of glorious texts.

by Kevin DeYoung at April 13, 2014 08:29 PM

sacha chua :: living an awesome life

Weekly review: Week ending April 11, 2014

A lot of coding this past week – moving stuff to Github, fixing bugs, making things a little more convenient… Two Emacs chats, too.

Started gardening again. =D Yay weather warming up!

Next week:

Blog posts


I think my focus on sketches is inversely proportional to my focus on code. They probably tickle the same part of my brain…

  1. 2014.04.07 Working fast and slow #experiment

Link round-up

Focus areas and time review

  • Business (40.7h – 24%)
    • [ ] Build: Find a user-friendly RSS plugin for WordPress
    • [ ] Earn: E1: 2.5-3.5 days of consulting
    • [ ] Explore converting ClojureBridge tutorial to Org
    • [ ] Explore membership plugins / course plugins
    • [ ] Record session on learning keyboard shortcuts
    • [ ] Write about planning for reasonable safety
    • Earn (16.7h – 41% of Business)
      • [X] E1 Unpinkify
      • [X] E1: Check for subscribers
      • [X] E1: Load people into comm
      • [X] Earn – E2: Re-render video 3 if necessary
      • [X] Earn – E2: Set up video 3?
      • [X] Earn: E1: 2.5-3.5 days of consulting
    • Build (20.8h – 51% of Business)
      • [X] Check that all my WordPress installations are up to date
      • [X] Get Emacs to show me a month of completed tasks, organized by project
      • [X] Improve Emacs Beeminder
      • [X] Make it easier to cross-link Org
      • [X] Package miniedit for MELPA?
      • [X] Run Hello World in Clojure from Emacs
      • [X] Sort out cache slam
      • [X] Sort out task templates and captures so that refiling, jumping, and clocking is easy
      • [X] Stop loading d3js
      • Drawing (1.5h)
      • Delegation (1.2h)
        • [X] Post Emacs tutorials links
      • Packaging (7.4h)
        • [X] Fix cover for Sketchnotes 2012
        • [X] Annotate my Emacs configuration
        • [X] Draw “A” page for Emacs ABCs
        • [X] Draft guide to getting started with Emacs Lisp
        • [X] Learn about bitbooks
        • [X] Review Sketchnotes 2012 digital proof
      • Paperwork (0.5h)
        • [X] File payroll return
        • [X] Plan my business and personal finances
    • Connect (3.2h – 7% of Business)
      • [X] Emacs Chat: Tom Marble
      • [X] Emacs chat prep: Iannis
      • [X] Emacs chat: Iannis Zannos – music
      • [X] Invite technomancy for an Emacs Chat
  • Relationships (12.9h – 7%)
    • [X] Attend W-’s family thing
    • [X] Check results for project F
    • [X] Get the Raspberry Pi camera working and get a top-down view
    • [X] Go to RJ White’s semi-retirement party
    • [X] Set up the Pi camera again
    • [ ] Raspberry Pi: Use bounding rectangle to guess litterbox use
    • [ ] Raspberry Pi: Extract blob pixels and try to classify cats
  • Discretionary – Productive (18.0h – 10%)
    • [X] Flesh out story
    • [X] Write monthly report taking advantage of Org tasks
    • [ ] Blog about user-visible improvements, Beeminder commit goal
    • [ ] Experiment with calculating ve
    • [ ] Plant beets, spinach, lettuce
    • [ ] Ask neighbours if anyone wants to split a bulk order of compost with us
    • Writing (5.9h)
      • [X] Write about discretionary speed
  • Discretionary – Play (2.8h – 1%)
  • Personal routines (21.5h – 12%)
  • Unpaid work (11.7h – 6%)
  • Sleep (61.2h – 36% – average of 8.7 per day)

The post Weekly review: Week ending April 11, 2014 appeared first on sacha chua :: living an awesome life.

by Sacha Chua at April 13, 2014 08:26 PM

Rands in Repose

Protecting Yourself from Heartbleed

Earlier this morning, I tweeted:

This is not actually good advice. You shouldn’t be changing your password on a server until the server administrator has confirmed whether their servers were affected and, if so, whether the server has been patched.

Mashable appears to have an up-to-date breakdown of the most popular services out there and their disposition relative to Heartbleed.


by rands at April 13, 2014 07:35 PM

Text Patterns

the keys to society and their rightful custodians

Recently Quentin Hardy, the outstanding technology writer for the New York Times, tweeted this:

If you follow the embedded link you’ll see that Head argues that algorithm-based technologies are, in many workplaces, denying to humans the powers of judgment and discernment:

I have a friend who works in physical rehabilitation at a clinic on Park Avenue. She feels that she needs a minimum of one hour to work with a patient. Recently she was sued for $200,000 by a health insurer, because her feelings exceeded their insurance algorithm. She was taking too long.

The classroom has become a place of scientific management, so that we’ve baked the expertise of one expert across many classrooms. Teachers need a particular view. In core services like finance, personnel or education, the variation of cases is so great that you have to allow people individual judgment. My friend can’t use her skills.

To Hardy’s tweet Marc Andreessen, the creator of the early web browser Mosaic and the co-founder of Netscape, replied,

Before I comment on that response, I want to look at another story that came across my Twitter feed about five minutes later, an extremely thoughtful reflection by Brendan Keogh on “games evangelists and naysayers”. Keogh is responding to a blog post by noted games evangelist Jane McGonigal encouraging all her readers to find people who have suffered some kind of trauma and get them to play a pattern-matching video game, like Tetris, as soon as possible after their trauma. And why wouldn’t you do this? Don't you want to “HELP PREVENT PTSD RIGHT NOW”?

Keogh comments,

McGonigal ... wants a #Kony2012-esque social media campaign to get 100,000 people to read her blog post. She thinks it irresponsible to sit around and wait for definitive results. She even goes so far as to label those that voice valid concerns about the project as “games naysayers” and compares them to climate change deniers.

The project is an unethical way to both present findings and to gather research data. Further, it trivialises the realities of PTSD. McGonigal runs with the study’s wording of Tetris as a potential “vaccine”. But you wouldn’t take a potential vaccine for any disease and distribute it to everyone after a single clinical trial. Why should PTSD be treated with any less seriousness? Responding to a comment on the post questioning the approach, McGonigal cites her own suffering of flashbacks and nightmares after a traumatic experience to demonstrate her good intentions (intentions which I do not doubt for a moment that she has). Yet, she wants everyone to try this because it might work. She doesn’t stop to think that one test on forty people in a controlled environment is not enough to rule out that sticking Tetris or Candy Crush Saga under the nose of someone who has just had a traumatic experience could potentially be harmful for some people (especially considering Candy Crush Saga is not even mentioned in the study itself!).

Further, and crucially, in her desire to implement this project in the real world, she makes no attempt to compare or contrast this method of battling PTSD with existing methods. It doesn’t matter. The point is that it proves games can be used for good.

If we put McGonigal’s blog post together with Andreessen’s tweet we can see the outlines of a very common line of thought in the tech world today:

1) We really earnestly want to save the world;

2) Technology — more specifically, digital technology, the technology we make — can save the world;

3) Therefore, everyone should eagerly turn over to us the keys to society.

4) Anyone who doesn’t want to turn over those keys to us either doesn't care about saving the world, or hates every technology of the past 5000 years and just wants to go back to writing on animal skins in his yurt, or both;

5) But it doesn't matter, because resistance is futile. If anyone expresses reservations about your plan you can just smile condescendingly and pat him on the head — “Isn’t that cute?” — because you know you’re going to own the world before too long.

And if anything goes astray, you can just join Peter Thiel on his libertarian-tech-floating-earthly-Paradise.

Enjoy your yurts, chumps.

by Alan Jacobs at April 13, 2014 07:00 PM

Planet Lisp

Paul Khuong: Number systems for implicit data structures

Implicit data structures are data structures with negligible space overhead compared to storing the data in a flat array: auxiliary information is mostly represented by permuting the elements cleverly. For example, a sorted vector combined with binary search is an implicit in-order representation of a binary search tree. I believe the seminal implicit data structure is the binary heap of Williams and Floyd, usually presented in the context of heapsort.

I find most developers are vastly more comfortable dealing with pointer-based data structures than with implicit ones. I blame our courses, which focus on the former and completely fail to show us how to reason about the latter. For example, the typical presentation of the binary heap introduces an indexing scheme to map from parent to children – the children of heap[i] are heap[2 * i] and heap[2 * i + 1], with one-based indices – that is hard to generalise to ternary or k-ary heaps (Knuth’s presentation, with parents at \(\lfloor i/2 \rfloor\), is no better). The reason it’s so hard to generalise is that the indexing scheme hides a simple bijection between paths in k-way trees and natural numbers.

I find it ironic that I first encountered the idea of describing data structures or algorithms in terms of number systems, through Okasaki’s and Hinze’s work on purely functional data structures: that vantage point seems perfectly suited to the archetypal mutable data structure, the array! I’ll show how number systems help us understand implicit data structures with two examples: a simple indexing scheme for k-ary heaps and compositions of specially structured permutations to implement in-place bit-reversal and (some) matrix transpositions.

Simple k-ary max heaps

The classical way to present implicit binary heaps is to work with one-based array indices and to root an implicit binary tree at heap[1]. For any node heap[i], the children live at heap[2 * i] and heap[2 * i + 1]. My issue with this presentation is that it’s unclear how to extend the scheme to ternary or k-ary trees: if the children of heap[1] are heap[3 * 1 ... 3 * 1 + 2], i.e., heap[3, 4, 5], what do we do with heap[2]? We end up with k - 1 parallel k-ary trees stashed in the same array. Knuth’s choice of mapping children to parents with a floored integer division suffers the same fate.

One-based indexing hides the beauty of the binary scheme. With zero-based arrays, the children of heap[i] are heap[2 * i + 1] and heap[2 * i + 2]. This is clearly isomorphic to the one-based scheme for binary heaps. The difference is that the extension to k-way trees is obvious: the children of heap[i] are heap[k * i + 1 ... k * i + k].
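The zero-based scheme translates directly into code. A minimal Python sketch (the function names `children` and `parent` are mine, for illustration):

```python
def children(i, k):
    """Zero-based indices of the k children of node i in a k-ary heap."""
    return range(k * i + 1, k * i + k + 1)

def parent(i, k):
    """Zero-based index of the parent of node i (for i > 0)."""
    return (i - 1) // k
```

For k = 2 this reduces to the familiar 2i+1 / 2i+2 pair, and every child maps back to its parent for any arity, with no parallel trees left over.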

A couple examples like the one below fail to show any problem. However, even a large number of tests is no proof. Thinking in terms of number systems leads to a nice demonstration that the scheme creates a bijection between infinite k-ary trees and the naturals.

There is a unique finite path from the root to any vertex in an infinite k-ary tree. This path can be described as a finite sequence \((c\sb{1}, c\sb{2}, \ldots, c\sb{n})\) of integers between 1 and k (inclusively). If \(c\sb{1} = 1\), we first went down the first child of the root node; if \(c\sb{1} = 2\), we instead went down the second child; etc. If \(c\sb{2} = 3\), we then went down the third child, and so on. We can recursively encode such finite paths as naturals (in pidgin ML):

path_to_nat [] = 0
path_to_nat [c : cs] = k * path_to_nat cs + c

Clearly, this is an injection from finite paths in k-ary trees to the naturals. There’s only a tiny difference with the normal positional encoding of naturals in base k: there is no 0, and digits instead include k. This prevents us from padding a path with zeros, which would map multiple paths to the same natural.

We only have to show that path_to_nat is a bijection between finite paths and naturals. I’ll do that by constructing an inverse that is total on the naturals.

nat_to_path 0 = []
nat_to_path n = let c = pop n
                in c : nat_to_path ((n - c) / k)

where pop is a version of mod k that returns k instead of 0:

pop n = let c = n `mod` k
        in if c != 0
           then c
           else k

The base case is nat_to_path 0 = [].

In the induction step, we can assume that path_to_nat cs = n and that nat_to_path n = cs. We only have to show that, for any \(1 \leq c \leq k\), nat_to_path (path_to_nat (c:cs)) = c:cs. Let n' = path_to_nat (c:cs) = k * n + c.

\[n\sp{\prime} = kn + c \equiv c\quad \mod k,\] so pop n' will correctly return c (and k rather than 0). It’s then a tiny bit of algebra to show that (n' - c) / k = n, and we fall back to the induction hypothesis.
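As a sanity check, the pidgin ML translates almost directly to Python (my translation, with `k` passed explicitly and the recursion unrolled into loops):

```python
def path_to_nat(path, k):
    """Encode a root-to-node path (digits 1..k, inclusive) as a natural."""
    n = 0
    for c in reversed(path):   # path_to_nat [c : cs] = k * path_to_nat cs + c
        n = k * n + c
    return n

def nat_to_path(n, k):
    """Inverse of path_to_nat; total on the naturals."""
    path = []
    while n:
        c = n % k or k         # pop: return k instead of 0
        path.append(c)
        n = (n - c) // k
    return path
```

Round-tripping every natural through `nat_to_path` and back recovers it exactly, which is the bijection the proof establishes.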

This scheme is so simple that I wound up coding a version of heapsort(3) that lets callers choose the heap’s arity at runtime. Higher arity heaps perform more comparisons but fewer swaps than binary heaps; the tradeoff is profitable when sorting large items. It seems to me that, for decades, we’ve been presenting implicit heaps and heapsort in a way that marginally simplifies the binary case at the expense of obscuring the elegant general case.
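The author's heapsort(3) code isn't shown, but a minimal Python sketch of a heapsort with runtime-selectable arity, using the zero-based k-ary indexing above, might look like this (my illustration, not the actual implementation):

```python
def heapsort_k(a, k=2):
    """In-place heapsort on list a with a k-ary max heap;
    children of node i live at k*i + 1 .. k*i + k."""
    n = len(a)

    def sift_down(i, end):
        while True:
            first = k * i + 1
            if first >= end:
                return
            # largest child among the (at most k) children in range
            big = max(range(first, min(first + k, end)), key=a.__getitem__)
            if a[big] <= a[i]:
                return
            a[i], a[big] = a[big], a[i]
            i = big

    # heapify: sift down every internal node, deepest first
    for i in range((n - 2) // k, -1, -1):
        sift_down(i, n)
    # repeatedly move the max to the end and restore the heap
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```

Note the tradeoff mentioned above: each `sift_down` step compares up to k children (more comparisons) but the tree is shallower (fewer swaps).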

Array permutations as algebra on positional number systems

Bit reversing an array of length \(2\sp{l}\) sends the element x[i] to x[j], where the binary representation of i (including padding up to l bits) is the reverse of j. For example, in an array of length 16, 3 = 0011 becomes 1100 = 12.

Reversing a fixed-width integer’s binary representation is its self-inverse, so bit reversing an array is a sequence of swaps. This means that the permutation can be performed in-place, as a series of independent swaps. Bit reversal used to be slow on cached machines: contiguous elements (with indices that only vary in their low order bits) swap with far-off elements (indices that only vary in their high order bits). Worse, the stride between the latter elements is a large power of two, which causes all sorts of aliasing issues. Workarounds (see Zhang 99 (PDF)) mostly end up implementing a software cache with explicit buffers. Nowadays, even L1 caches have such a high associativity that aliasing is a much less important issue.
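For concreteness, here is a naive Python sketch of the in-place bit-reversal permutation, ignoring the cache issues just discussed:

```python
def bit_reverse(i, l):
    """Reverse the low l bits of i."""
    r = 0
    for _ in range(l):
        r = (r << 1) | (i & 1)
        i >>= 1
    return r

def bit_reverse_permute(x):
    """In-place bit-reversal permutation; len(x) must be a power of two."""
    l = len(x).bit_length() - 1
    for i in range(len(x)):
        j = bit_reverse(i, l)
        if i < j:  # the permutation is a set of disjoint swaps; do each once
            x[i], x[j] = x[j], x[i]
    return x
```

`bit_reverse(3, 4)` gives 12, matching the 0011 → 1100 example, and fixed points (palindromic indices) are left alone.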

Napa-FFT3 implements bit reversals by calling a few specialised functions that only swap the lower and higher bits; the main routine iterates over an array of precomputed middle bit reversals (similar to various publications of Elster’s, but recursing on the middle bits first). In this implementation, the number of L1 cache misses incurred by bit reversing an array is only slightly greater than the compulsory misses. Bit reversal isn’t free, but it’s also not clear that autosorting FFTs are quicker than out-of-order FFTs followed by a bit reversal pass.

Bit reversal is the only array permutation I’ve seen described in terms of its effect on indices. I think it’s a fruitful avenue for other in-place permutations.

For example, the viewpoint makes it clear how to transpose a matrix of dimensions \(2\sp{m} \times 2\sp{n}\) with a sequence of in-place bit reversals (each \(i\sb{k}\) and \(j\sb{l}\) is a bit in the index’s binary representation).

For a row-major layout, the sketch above corresponds to:

  1. bit reverse each row of length \(2\sp{n}\) in place;
  2. bit reverse the whole vector of length \(2\sp{m + n}\) in place;
  3. bit reverse each new row of length \(2\sp{m}\) in place.
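A self-contained Python sketch of these three passes follows (my illustration of the idea, not Napa-FFT3’s code; the row slices copy in Python, so it is only morally in-place):

```python
def bit_reverse(i, l):
    """Reverse the low l bits of i."""
    r = 0
    for _ in range(l):
        r = (r << 1) | (i & 1)
        i >>= 1
    return r

def bit_reverse_permute(x):
    """In-place bit-reversal permutation; len(x) must be a power of two."""
    l = len(x).bit_length() - 1
    for i in range(len(x)):
        j = bit_reverse(i, l)
        if i < j:
            x[i], x[j] = x[j], x[i]

def transpose(x, m, n):
    """Transpose a 2^m x 2^n row-major matrix stored flat in x,
    using only bit-reversal passes."""
    rows, cols = 1 << m, 1 << n
    for r in range(rows):                # 1. bit reverse each row of length 2^n
        row = x[r * cols:(r + 1) * cols]
        bit_reverse_permute(row)
        x[r * cols:(r + 1) * cols] = row
    bit_reverse_permute(x)               # 2. bit reverse the whole vector
    for r in range(cols):                # 3. bit reverse each new row of length 2^m
        row = x[r * rows:(r + 1) * rows]
        bit_reverse_permute(row)
        x[r * rows:(r + 1) * rows] = row
    return x
```

Tracing the index bits shows why this works: pass 1 turns [r][c] into [r][rev(c)], pass 2 reverses the whole string to [c][rev(r)], and pass 3 fixes the column bits, leaving element (r, c) at position (c, r) of the transposed 2^n × 2^m matrix.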

Bit reversal, like all other permutations, is a reversible linear operation. We can thus change the order of operation if we want to. For example, it’s not necessarily preferable to bit-reverse contiguous rows first. We could also flip the high-order bits of the indices: rather than swapping scalars, we would swap rows. Separately bit reversing contiguous rows works best when each row fits in cache. Bit reversing columns instead amortises the bad access pattern inherent to bit reversal by spending more time on each swap: swapping rows is slower than swapping scalars, but also very efficient with regard to (streaming!) I/O.

This is interesting because in-place transposition of rectangular matrices is hard, and transposition is already a bad fit for caches. Transposing matrices with a sequence of bit reversals might just be practical. In fact, that’s what I intend to do in Napa-FFT3 for multi-dimensional DFTs: we can fuse all but the middle whole-vector bit reversal with mixed-radix FFTs (and the latter might similarly benefit from operating on [sub-]rows rather than scalars).

One obvious question now appears: can we generalise the trick to general dimensions? It’s pretty clear that we can do it for any other base \(b\) and matrices of dimension \(b\sp{m} \times b\sp{n}\) (it’s interesting how highly composite dimensions are easy to transpose, and, IIRC, so are coprime ones). What if there’s no such factorisation? The best I can do is “more or less.”

For arbitrary matrix dimensions \(m \times n\), I think it’s best to decompose indices in a mixed radix (but still positional) number system. For example, a \(63 \times 21\) matrix might have indices in radix \(3,7\ |\ 3,7,3\). Given this number system, matrix transposition is

It’s a small generalisation to let the radices be \(a,b\ |\ a,b,a\), for a matrix of dimension \(ab \times a\sp{2}b\). We can then perform most of a matrix transposition by swapping positions of identical weight: first a full mixed-radix digit reversal (the weights are palindromic), followed by another mixed-radix reversal on the first three positions.

This leaves the last chunk \(b\sb{2},a\sb{3}\), which should instead be \(a\sb{3},b\sb{2}\). That’s another rectangular matrix transpose, but smaller than the original one. It might be practical to execute that last step with a straightforward out-of-place transpose: a smaller transpose needs less scratch space and may fit in cache. We can also apply the same trick as for bit reversals and apply the transpose before everything else, by permuting rows rather than scalars. The simplest way to do that is to transpose a matrix of pointers before replicating the permutation on the actual data (glibc’s mergesort references Knuth vol. 3, exercise 5.2-10).
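
Here is a Python sketch of the whole pipeline with \(a=2, b=3\) (a \(6 \times 12\) matrix). The helper names are mine, not Napa-FFT3’s, and in this sketch the leftover fix is modeled as a swap of the final two digit positions; together with the full digit reversal and the reversal of the first three positions, the result is the transpose.

```python
def digits(i, radices):
    # mixed-radix decomposition, most significant digit first
    out = []
    for r in reversed(radices):
        out.append(i % r)
        i //= r
    return out[::-1]

def undigits(ds, radices):
    i = 0
    for d, r in zip(ds, radices):
        i = i * r + d
    return i

def apply_digit_perm(vec, radices, perm):
    # perm[j] names the old digit position that lands in new position j;
    # each digit carries its radix along with it.
    new_radices = [radices[p] for p in perm]
    out = [None] * len(vec)
    for i, x in enumerate(vec):
        ds = digits(i, radices)
        out[undigits([ds[p] for p in perm], new_radices)] = x
    return out, new_radices

a, b = 2, 3
rows, cols = a * b, a * a * b            # a 6x12 matrix
vec = list(range(rows * cols))           # row-major
rads = [a, b, a, b, a]                   # palindromic radix sequence
vec, rads = apply_digit_perm(vec, rads, [4, 3, 2, 1, 0])  # full digit reversal
vec, rads = apply_digit_perm(vec, rads, [2, 1, 0, 3, 4])  # first three positions
vec, rads = apply_digit_perm(vec, rads, [0, 1, 2, 4, 3])  # leftover two-digit swap

# vec now holds the 12x6 transpose in row-major order
assert all(vec[c * rows + r] == r * cols + c
           for r in range(rows) for c in range(cols))
```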

Finally, this also works for \(a\sp{2}b \times ab\): matrix transposition is its own inverse, so we only have to execute the inverse of each step, in reverse order.

Definitely mixed results, but at least we have some intuition on why general rectangular transpositions are hard to perform in place: they’re hard to express as sequences of swaps.

Next: C code and cycle counts!

This post is the more theoretical prologue to a low-level look at qsort(3): I really wanted to make sure the nice implicit tree layout in the first section had the space it deserves.

I tried to make in-order implicit trees fit in this number system approach. I can’t see how. The problem is that in-order trees associate ranges (rather than indices) with nodes; for example, at what depth is index 1000? It depends on the size of the search tree. It might be the root (in a tree of 2000 vertices) or a leaf (in a tree of 1001 vertices).

April 13, 2014 04:20 PM


This Week in Runblogging: 4/7 to 4/13 2014

Last week seemed to herald the official arrival of Spring here in New Hampshire. Almost all of the snow has finally melted, the sun has been shining, and the trails have been muddy and wet rather than snowy and icy (a great test for the Salomon Sense Pros that have been on my feet a lot lately). Feels great to not have to bundle up in multiple layers for every run, and I’m hoping the tights can stay in the closet for the next nine months!

There’s nothing quite like warming weather, sunshine, and improving fitness to make a runner content!

Below are the posts that were published on Runblogger from 4/7-4/13:

Do You Pronate?: A Shoe Fitting Tale
April 10, 2014 – A post about my experience listening to a woman get fitted for shoes at a local sporting goods store, with some thoughts on the pronation control model of fitting shoes

Blowing the Dust Off the Brooks Vantage
April 8, 2014 – Jon Gugala’s second guest post on the new Brooks Heritage line of vintage running shoes.

Pearl Izumi EM Road N0 Racing Flat Review
April 7, 2014 – Review of one of my current favorite running shoes. The PI N0 is a fast racing flat that reminds me of the Saucony Grid Type A5. And that’s a very good thing!

Wear the Brooks Vanguard and Stop Looking Like a Run Nerd
April 7, 2014 – Jon Gugala’s first guest post on the new Brooks Heritage line of vintage running shoes.

Recommended Articles From Around the Web

Lots of reviews to share this week:

1. My review of the Pearl Izumi EM Road N0 wasn’t the only one that came out last week – Steve Speirs over at Run Bulldog Run also reviewed the N0, and he had similarly positive thoughts: “The Road N0 is a great example of less is more. My size 9.5 weighed in at just 6.5oz, making it one of the lightest shoes in my current rotation. However, just because it’s light, doesn’t mean that it’s a harsh ride – the Road N0 delivers a smooth, responsive ride with a perfect balance of cushioning and comfort.”

2. Sam Winebaum reviews the New Balance Fresh Foam 980 Trail. He concludes: “The 980 Fresh Foam Trail is a low drop (4mm), very solid, supportive mid weight trainer suitable for both smooth and rough trails. It is on the heavier side (10.25 oz) of modern trail runners which often come in under 10 oz, but given the cushion, deep and effective lugs, and rugged upper I think worth the weight for old legs, longer runs and tougher terrain.” Interestingly, he also reports that early versions of the road 980 had a manufacturing error and the midsole was firmer than spec.

3. Detroit Runner reviews the Garmin Vivofit fitness tracker. I’ve had a Vivofit for several weeks now and am really liking it (though I’m not loving the new Garmin Connect website). Jeff has had a similar experience, so I thought I’d share it here in advance of my own review (coming soon!).

4. Thomas Neuberger reviews the Altra Superior 1.5. As with the original Superior, traction is not great, but the shoes offer a roomy toebox and a flexible sole for light, non-technical trail duty.

5. Minimallyshoddy reviews the Skora Fit. The Fit is a more cushioned shoe than most others offered by Skora, and more affordably priced. For minimallyshoddy it ticks all the right boxes: “My perfect shoe is a shoe I’m not thinking about while I’m running in it. It should disappear. Even when I’m on pavement I want to be able to close my eyes and feel like I’m running barefoot. No shoe is that good, but the FIT is just a great combination of all of the qualities I look for in a shoe.”

by Peter Larson at April 13, 2014 03:55 PM


Stepping away

I’m turning on moderation so I can step away for a week or two.

by Dalrock at April 13, 2014 03:09 PM


reflector: One more for Archers

In the interest of parity, and since there have been a lot of Debian-only posts in the past, here’s reflector — an Arch-only trick.


Mirror management is usually an easy-to-forget, one-time task when building a system, but it might be worth keeping reflector in mind.

I’ve used rankmirrors plenty of times, and if there’s no other available option, it does a fine job. But rankmirrors does expect you to do a little background work, and at times can be a bit time-consuming. All of which is easy to work around, of course.

reflector, in my humble opinion, has the added bonus of being able to filter mirrors by geographical area, which is great if you’re a world traveler and want to update between stopovers.

Or it might just be that some of the mirrors rankmirrors gave you are sluggish or remote, in which case reflector might have a few better ideas for you.

And of course, the best place to learn about reflector is on the one-and-only Arch wiki, which is only the best source for Linux information in the universe. Regardless of your distro. ;)

Tagged: mirror, rank, sort, test

by K.Mandla at April 13, 2014 02:13 PM


The Ridiculous Entry into Jerusalem

Today we begin Holy Week, the last week of Jesus’ pre-Resurrection ministry, by celebrating Palm Sunday and his Triumphant Entry into Jerusalem. Here is the standard account in Matthew:

Now when they drew near to Jerusalem and came to Bethphage, to the Mount of Olives, then Jesus sent two disciples, saying to them, “Go into the village in front of you, and immediately you will find a donkey tied, and a colt with her. Untie them and bring them to me.  If anyone says anything to you, you shall say, ‘The Lord needs them,’ and he will send them at once.” This took place to fulfill what was spoken by the prophet, saying,

 “Say to the daughter of Zion,
‘Behold, your king is coming to you,
    humble, and mounted on a donkey,
    on a colt, the foal of a beast of burden.’”

 The disciples went and did as Jesus had directed them.  They brought the donkey and the colt and put on them their cloaks, and he sat on them. Most of the crowd spread their cloaks on the road, and others cut branches from the trees and spread them on the road. And the crowds that went before him and that followed him were shouting, “Hosanna to the Son of David! Blessed is he who comes in the name of the Lord! Hosanna in the highest!” And when he entered Jerusalem, the whole city was stirred up, saying, “Who is this?” And the crowds said, “This is the prophet Jesus, from Nazareth of Galilee.” (Matt. 21:1-9)

With ears trained by a couple thousand years of church history to hear these Hosannas as those of glorious choirs, and to see the donkey as a dignified steed, we miss the irony of this most ridiculous of all entries. John Calvin highlights how foolish the whole thing would have been:

This would have been a ridiculous display, if it had not been in accordance with the prediction of Zechariah, (9:9.) In order to lay claim to the honors of royalty, he enters Jerusalem, riding an ass. A magnificent display, truly! more especially when the ass was borrowed from some person, and when the want of a saddle and of accouterments compelled the disciples to throw their garments on it, which was mark of mean and disgraceful poverty. He is attended, I admit, by a large retinue; but of what sort of people? Of those who had hastily assembled from the neighboring villages. Sounds of loud and joyful welcome are heard; but from whom? From the very poorest, and from those who belong to the despised multitude. One might think, therefore, that he intentionally exposed himself to the ridicule of all.

And yet, this was necessary because:

…in consequence of the time of his death being at hand, he intended to show, by a solemn performance, what was the nature of his kingdom. So then, as his removal to heaven was at hand, he intended to commence his reign openly on earth….But as he had two things to do at the same time, — as he had to exhibit some proof of his kingdom, and to show that it does not resemble earthly kingdoms, and does not consist of the fading riches of this world, it was altogether necessary for him to take this method. (Harmony of the Gospels, Vol 2, Comment on Matthew 21:1)

This is the way the King came announcing his kingdom: in humility, poverty, absurdity, and weakness. And yet, because of this, we see all the more clearly that it “does not consist in the fading riches of this world.” The gold and the pomp we might have expected would have only obscured the true glory of our King.

So then, as we sing our hosannas today, and lift our palms to the King of glory, let us recall his humble, and, indeed, ridiculous entry into Jerusalem.

Soli Deo Gloria

by Derek Rishmawy at April 13, 2014 01:54 PM


rubyripper: The options continue to multiply

I should apologize for the gap in communications for a few days this week. I was preoccupied with some personal events, and as luck would have it, I find I am also beset with computer issues. More on that later.

For now, rubyripper is next on the Master List.


We just saw ripit a week ago or so, and while it’s true that there’s only so much you can do with a console-based CD ripper utility, rubyripper seems quite competent.

True, it doesn’t seem to have as many low-level controls as ripit, but there are distinct audiences among computer users, and for some, a simpler, quicker interaction (notice I didn’t say “interface”) is better.

As you can see, rubyripper has options to rip to flac, ogg, or mp3, with the command-line flags for each encoder edited by hand. Suffice it to say, if you don’t know what the controls are for lame, you’ll probably just want to keep the defaults. ;)

One thing I like about rubyripper is the default rip location. Without any prompting, rubyripper dropped the resulting tracks into a folder called “vorbis,” which kept them from polluting my home directory. My OCPD thanks you, rubyripper.

abcde is my first-line pick for console CD ripping, but rubyripper has its charms. Following the theme I mentioned with ripit though, sometimes it’s not so much about the overarching program as the underlying support software.

A sad note: It seems the author has drifted away from the project, citing a lack of need to rip CDs any longer, since online services are quicker and easier. Much like I have said several times over, there doesn’t seem to be much call for CD conversion, and I don’t even know how many of my friends own CDs any longer. :(

It’s the circle of life.

One last bonus with rubyripper: Install ruby-gtk2 and get … a graphical interface!


I don’t see rubyripper in Debian, which is a bit of a shame. So … Arch :) Debian :( That could always change though. ;)

Tagged: audio, cd, disc, music, rip, ripper

by K.Mandla at April 13, 2014 01:41 PM

Crossway Blog

The Final Days of Jesus: Sunday, March 29, AD 33


In this week’s video series, well-known New Testament scholars explore the background and significance of the history-shaping events that occurred during Jesus’s last week on earth. Designed as a supplement to The Final Days of Jesus, our prayer is that these videos will help deepen your understanding and experience of Holy Week.

The Final Days of Jesus: Palm Sunday
from Crossway on Vimeo.



The Final Days of Jesus: The Most Important Week of the Most Important Person Who Ever Lived
Andreas J. Köstenberger and Justin Taylor, with Alexander Stewart

Combining a chronological arrangement of the biblical text with insightful commentary, this book serves as a day-by-day guide to Jesus’s final week on earth, complete with a quick-reference glossary and color maps.

Free Downloads:
Excerpt / Study Guide / 40-Day Reading Guide


by Matt Tully at April 13, 2014 01:30 PM

Caelum Et Terra

Palm Sunday


* God has a perfect plan for your life. But it is as incomprehensible as He Is.

* Human constructs can illuminate or veil reality. Worse, many of the illuminative ones, even the ones not planted by human hands, can become veils and usually do.

* Any ‘theology’ that believes that most humans are going to suffer for all eternity is not worthy of consideration, for the ‘God’ it portrays is not worthy of worship: a total failure and cruel, too.

* In fact, the holy Being we call ‘God’ sets a very low standard. If that is not true, if most people are hell-bound, I might as well give up. And so should you, if you possess any self-knowledge at all.

* Fortunately, we can look at the ones he has chosen in sacred history and see how condescending ‘God’ is: aside from His Mother, it is  a collection of knaves and knuckleheads. Which should give us hope. Universalism? Not quite; one must leave room for human freedom, even the craziest kind that would reject Love for Self, even when that Self, by the choice, is reduced to a cold hard turd of a thing.

* In fact, if that most noxious of theologies, Calvinism, is true, we may as well ‘curse God and die’, for not only is He worthy of a curse, but we are without hope.

* Oh, except for the ‘elect’. Who are in for one Hell of a surprise.

* When I was younger I thought I was born in the wrong age. This is not an uncommon thing for romantics. And I did not like ‘modern’ science when I was in school; whatever Einstein was doing, high school science when I was a kid in the 60s was rationalist and mechanistic, and ran entirely against the mythopoetic approach to reality that came naturally to me. And anthropocentric: the assumption was that Man was about to conquer the natural world, make it sit up and beg. Not now; science if anything reveals a universe infinitely more mysterious and intricate than ever we could have imagined. And ultimately more beautiful and incomprehensible and, well, humbling. The age of scientism is over, as mysticism and science merge more and more.

* Meanwhile, ‘traditionalist’ Catholics have produced a documentary, by splicing together segments of interviews with physicists, taken out of context, promoting … geocentrism.

* Me, I had another half-fast Lent. I am no longer a super Catholic, let alone a super Byzantine Catholic. I observed only the minimal ecclesial ‘fast’, like the average Catholic that I am.

* But I have three teenagers (four, if you count my precocious 11 year old, Maria), and that is plenty of penance, thank you. Plus, I just went through a five month period of intense physical deprivation. It was called ‘the worst winter ever’. What? It was imposed, not willed? So are all the best fasts.

* For what it is worth, I have pondered my sinfulness more this Lent than I ever did when I was observing all the traditional ascetic rules. Whole days I have spent walking around, realizing that I have been an ungrateful asshole more often than not, and full of pride to boot. And I can’t seem to keep my mouth shut when I should, engaging in fruitless arguments and general smartassedness.

* ‘Oh Lord and Master of my life, keep from me the spirit of indifference, despondency, lust for power and idle chatter. Instead, give to your servant the spirit of chastity, humility, patience and love. My Lord and King, give me the grace to be aware of my sins, without judging my brothers and sisters. For You are blessed, now and forever, unto ages of ages. Amen.’

* A blessed Palm Sunday and Holy Week to all….

by Daniel Nichols at April 13, 2014 12:55 PM

Text Patterns

the internet and the Mezzogiorno

Auden on Ischia, by George Daniell

From the late 1940s to the late 1950s, W. H. Auden spent part of each year on the Island of Ischia in the Bay of Naples. When he bought a small house in Austria and left Italy, he wrote a lovely and funny poem called "Good-bye to the Mezzogiorno" in which he reflected on how he, as the child of a "potato, beer-or-whiskey / Guilt culture," never became anything more than a stranger in southern Italy.

As he thinks about the people of that region, he wonders if, despite the liveliness of the culture, they might be "without hope." And he muses, 

                                This could be a reason
Why they take the silencers off their Vespas,
    Turn their radios up to full volume,  

And a minim saint can expect rockets — noise
    As a counter-magic, a way of saying
Boo to the Three Sisters: "Mortal we may be,
    But we are still here!"

I thought of this poem the other day when I saw this story about how NPR played a little trick on its Facebook fans: giving them a headline that was not accompanied by an actual story, but that people commented on — vociferously, confidently — anyway. Writing like this, and it constitutes the vast majority of all online commenting, is not so much an attempt at communication or rational conversation as it is an assertion of presence: "Mortal we may be, / But we are still here!" And the more assertive your comments are, the harder it is to deny your presence. Abusing people whose (often imagined) views you disdain is like taking the silencer off your Vespa; writing in all caps is like turning your radio up to full volume.

Which raises the question of why so many people feel so strongly the need to announce their presence in the internet's comboxes. Surely not for the same reason that people like me write blog posts!

by Alan Jacobs at April 13, 2014 12:38 PM

Justin Taylor

Holy Week, Day 1: Palm Sunday

Sunday, March 29, AD 33.

The following video, filmed in conjunction with our book The Final Days of Jesus, features short explanations from and interviews with New Testament professors Doug Moo (of Wheaton College Graduate School) and Andreas Köstenberger (of Southeastern Baptist Theological Seminary). We will be releasing a new video each day this week.

by Justin Taylor at April 13, 2014 04:00 AM

assertTrue( )

Whooping Cough Genomics

Pertussis, also known as whooping cough, is a highly contagious respiratory infection caused by Bordetella pertussis, a small aerobic bacterium that secretes numerous toxins capable of disrupting a normal immune response. The disease is rarely fatal but leaves victims with a nasty cough that can last weeks. In 2012, in the U.S., some 48,277 cases of pertussis were reported to the CDC. Of those cases, only 20 were fatal. By contrast, 28 Americans were killed by lightning the same year.

Bordetella pertussis
Unlike tuberculosis (which has been with us for 3 million years), Bordetella shows evidence of being a fairly new (and still rapidly evolving) pathogen, although in this case "fairly new" could still mean 700,000 years.

The complete DNA sequence of B. pertussis has been available for several years. It shows a moderate-size genome (of 4 million base pairs) encoding 3,447 genes, with a substantial number (360) of pseudogenes. The latter represent genes that have (by one means or another) been inactivated, whether through the appearance of premature stop codons in the gene, loss of a promoter region, random deletions, or what have you.

What makes Bordetella's pseudogenes interesting is that they're in remarkably good shape, as pseudogenes go. Usually, once a gene gets inactivated (goes pseudo), it begins to accumulate random point mutations, deletions, insertions, etc. at a substantial rate. In other words it deteriorates, since (supposedly) it's no longer under selection pressure. But when Australian researchers looked at 358 pseudogenes in B. pertussis Tohama I strain, they were shocked to find that the rate of nucleotide polymorphisms (i.e., changes to individual base-pairs in the DNA) was actually lower in pseudogenes than in regular genes (4.7E-5 per site versus 5.1E-5). That's exactly the opposite of what's expected. The researchers commented, somewhat laconically: "This suggests that most pseudogenes in B. pertussis were formed in the recent past and are yet to accumulate more mutations than functional genes."

What other explanation is there? Well, the most obvious alternative explanation is that the genes are still under selection pressure, even though they're turned off. How can that be? I can think of any number of scenarios; perhaps that'll be a future blog post. Suffice it for now to say, ribosomes are not totally unforgiving of missing stop codons (read up on tmRNA) nor are they unforgiving, in all cases, of frameshifts (read about programmed frameshifts), and if an open reading frame should appear on a pseudogene's antisense strand, you now have an RNA silencer (potentially) for the remaining good copy or copies of the gene, with attendant gene-modulation possibilities.

It's worth pointing out that pseudogenes in M. leprae (the leprosy bacterium) are not only conserved and ancient but continue to show strong homology to working orthologues in M. tuberculosis (and even more distantly related organisms such as Gordonia, Corynebacterium, and Nocardia) after millions of years. More of which, in a later post.

For now, I thought it might be worth looking at the base composition of B. pertussis pseudogenes to see if they're riddled with frameshift errors (as is the case with M. leprae's pseudogenes). When I analyzed all 1,125,521 codons for all normal (not pseudo) genes in B. pertussis Tohama I strain, the resulting "paintball diagram" of base composition came up looking like this:
Paintball diagram for normal genes in B. pertussis Tohama I (click to enlarge). Red dots are for codon base one, gold represents the composition at codon base two, blue is "wobble" (third) base composition. Every dot represents statistics for one gene (n=3447). See text for discussion.

Here, we're looking at purine (A+G) content versus G+C content for each base position in the codons. Every dot represents a gene's worth of data. Not unexpectedly, the most extreme G+C values occur in the third ("wobble") base. Codon base one (red dots) is purine-rich, centering on y=0.58. This is typical of most codons in most genes, in most organisms. Notice the "breakaway cloud" of gold points underneath the main gold cloud (at y<0.4). These points represent genes in which the second codon base is mostly a pyrimidine (C or T). Codons with a pyrimidine in base two tend to code for nonpolar amino acids. Thus, the breakaway cloud of gold points represents membrane-associated proteins. In this case, we're looking at about 558 genes falling in that category.
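
For readers who want to reproduce this kind of analysis, here is a minimal Python sketch (my own code, not the tooling used for the figures) of the per-gene statistics behind the paintball diagrams: purine (A+G) and G+C fractions at each codon base position, plus the pyrimidine-rich-second-base heuristic for membrane-associated proteins described above. The toy sequence is illustrative, not real B. pertussis data.

```python
def codon_stats(seq):
    """Per codon base position (1..3): (purine_fraction, gc_fraction)."""
    seq = seq.upper()
    stats = []
    for pos in range(3):
        bases = seq[pos::3]                       # every 3rd base, offset `pos`
        stats.append((sum(b in "AG" for b in bases) / len(bases),
                      sum(b in "GC" for b in bases) / len(bases)))
    return stats

def looks_membrane_associated(seq, threshold=0.40):
    # pyrimidine-rich second codon base => mostly nonpolar amino acids
    return codon_stats(seq)[1][0] < threshold

gene = "ATGGCCATT"   # toy 3-codon sequence
assert codon_stats(gene)[0][0] == 1.0       # base 1 here is all purines
assert looks_membrane_associated(gene)      # base 2 here is all pyrimidines
```

Scatter-plotting the (purine, G+C) pairs for every gene, one dot per gene and codon position, gives a diagram of the same shape as the ones shown here.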

Now look at the paintball diagram for the organism's 360 pseudogenes:

Base composition for "codons" in 360 pseudogenes of B. pertussis Tohama I. (Click to enlarge.) In this graph, as in the one above, dots are rendered with an opacity of 60% (so that overlapping points are less likely to obscure each other). See text for discussion.

In this case, there's a considerable amount of random statistical splay, but some of that is due simply to the fact that pseudogenes are a good deal shorter than normal genes, giving rise to more noise in the signal. (In this case, the average length of a pseudogene is 482 bases, vs. 982 for the 3,447 "normal" genes.) Even with considerable noise, though, it's apparent that the dot clusters tend to center on different parts of the graph, corresponding to the expected locations for normal genes. (Contrast this with the situation in M. leprae, where pseudogenes are riddled with frameshifts, rendering the concept of "codon base position" moot. Refer to the second paintball graph on this page.) Thus, we can say with some confidence that frameshifts are not so rampant in B. pertussis pseudogenes as to have rendered the concept of codons irrelevant. In fact, compared to M. leprae, pseudogenes in B. pertussis are comparatively unaffected by frameshifts. This tends to support the view of the Australian researchers (mentioned earlier) that pseudogenes in B. pertussis have not had enough time to accumulate very many mutations. But it can also be hypothesized that B. pertussis has had plenty of time (700,000 years, in fact) in which to accumulate mutations in its pseudogenes, yet has not done so. The evidence suggests that if anything, Bordetella repairs pseudogenes even more faithfully than regular genes.

At this point it might be relevant to interject that while M. leprae (like other members of the Mycobacteria) lacks the MutS/MutL mismatch repair system, Bordetella does, in fact, have a MutS/MutL mismatch repair system, and this may explain the relative paucity of frameshift errors in Bordetella pseudogenes. But it also implies (rather queerly) that Bordetella goes out of its way to repair its pseudogenes.

Interestingly, 234 out of 360 pseudogenes have an AG1 (purine, base one) content greater than 55%, which means they're probably still "in frame." Of these 234, some 69 (30%) have AG2 less than 40%, meaning they're most likely genes for membrane-associated proteins. If we look at the 2,456 normal genes that have AG1 greater than 55%, only 398 (16%) are putative membrane-associated proteins (with AG2 less than 40%). Bottom line: Pseudogenes for putative membrane-associated proteins are twice as likely to still be in-frame. While this could be a statistical fluke, it could also be that membrane proteins are somehow "spared" preferentially when it comes to leaving pseudogenes translatable. To put it differently: Pseudogenes for non-membrane-associated proteins are less likely to remain in-frame. This makes sense, in that much of Bordetella's pathogenicity can be ascribed to proteins that make up cell-surface antigens or that transport toxins to the outside world. Some of the toxic surface proteins may, in fact, be nonsense (or partial-nonsense) proteins—products of pseudogenes.

by Kas Thomas at April 13, 2014 04:00 AM

Doc Searls Weblog

Earth to Mozilla: Come back home

In her blog post explaining the Brendan Eich resignation, Mitchell Baker, Chair of the Mozilla Foundation, writes, “We know why people are hurt and angry, and they are right: it’s because we haven’t stayed true to ourselves.” In Mozilla is Human, Mark Surman, Executive Director of the Foundation, adds, “What we also need to do is start a process of rebirth and renewal. We need to find our soul and our spirit.”

That spirit is embodied in the Mozilla Manifesto. But it goes deeper than that: all the way back to Mosaic, the ur-browser from which Firefox is descended by way of Netscape Navigator.

Neither Mosaic nor Navigator was an instrument of the advertising business. They were boards we rode to surf from site to site across oceans of data, and cars we drove down the information superhighway.

But now all major browsers, Firefox included, have become shopping carts that get re-skinned at every commercial site they visit, and infected at many of those sites by cookies and other tracking files that report our activities back to advertising mills, all the better to “personalize” our “experience” of advertising and other “content.”

Economically speaking, Firefox is an instrument of advertising, and not just a vehicle for users. Because, at least indirectly, advertising is Firefox’s business model. Chrome’s too. (Apple and Microsoft have much smaller stakes in advertising, and offer browsers mostly for other reasons.)

This has caused huge conflicts for Mozilla. On the one hand they come from the users’ side. On the other, they need to stay in business — and the only one around appears to be advertising. And the market there is beyond huge.

But so is abuse of users by the advertising industry. This is made plain by the popularity of Adblock Plus (Firefox and Chrome’s #1 add-on by a huge margin) and other instruments of prophylaxis against both advertising and tracking (e.g. Abine, Disconnect, Ghostery and Privowny, to name a few).

To align with this clear expression of market demand, Mozilla made moves in February 2013 to block third party cookies (which Apple’s Safari, which doesn’t depend on advertising, does by default). The IAB (Interactive Advertising Bureau) split a gut, and began playing hardball. Some links:

That last item — an extensive bill of particulars — featured this sidebar:

The link goes to An Open Letter to the Mozilla Corporation.

So Mozilla looked for common ground, and they found it on the advertising side, with personalization. Near as I can tell, this  began in May 2013, with Jay Sullivan‘s Personalization With Respect post. In July, Justin Scott, then a Product Manager at Mozilla Labs, vetted A User Personalization Proposal for Firefox. The post was full of language straight out of the ad industry songbook: “favorite brands,” “personalized experience,” “increased engagement,” “stronger loyalty.” Blowback in the comments was fierce:


I don’t care what publishers want, or that they really like this new scheme to increase their marketing revenue. Don’t add more tracking.

I’m beginning to realize that Mozilla is working to make Firefox as attractive to publishers as possible, while forgetting that those eyeballs looking at their ads could be attached to people who don’t want to be targeted. Stop it. Remember your roots as a “we’ll take Mozilla’s code, and make a great thing with it”, and not as “Google pays us to be on the default toolbar”.

Dragonic Overlord:

Absolutely terrible idea.

The last thing the internet needs is more “personalization” (read: “invasion of my privacy”). All your marketing jargon does nothing to hide the fact that this is just another tool to allow advertisers, website owners, the NSA, and others to track users online habits and, despite any good intentions you might have, it’s rife with the potential for abuse.

Tracy Licklider:

Bad idea. I do not want it. I think you misstate the benefits of the Internet. One of the most salient benefits of the Internet is for web sites, advertisers, and ISPs who are able to build dossiers about individuals’ private lives/data, generally without most users being aware of the possibility and generally without the users’ consent.

One of the main reasons Firefox has succeeded is that it, unlike all the other browsers, was dedicated to users unfettered, secure, and as private as possible use of the Internet.


If this “feature” becomes part of FireFox you’ll loose many users, if we wanted Chrome like browser we wouldn’t have chosen FireFox. We chose FireFox because it was DIFFERENT FROM Chrome but lately all I see is changes that make it similar and now you want to put spyware inside? Thanks but no thanks.

A follow-up post in July, by Harvey Anderson, Senior VP Business and Legal Affairs at Mozilla, was titled Up With People, and laid on even more of the same jive, this time without comments. In December Justin posted User Personalization Update, again with no comments.

Then in February, Darren Herman, Mozilla’s VP Content Services, posted Publisher Transformation With Users at the Center, introducing two new programs. One was User Personalization. (Darren’s link goes to Justin’s July piece.) The other was something called “directory tiles” that will appear on Firefox’s start page. He wasn’t explicit about selling ads in the tiles, but the implication was clear, both from blowback in the comments and from coverage in other media.

Said Reuters, “Mozilla, the company behind the Firefox Internet browser, will start selling ads as it tries to grab a larger slice of the fast-expanding online advertising market.”

Romain Dillet in TechCrunch wrote, “For the last couple of years, Mozilla and the advertising industry have been at odds. The foundation created the do-not-track feature to prevent targeted advertising. When users opt in, the browser won’t accept third party cookies anymore, making it much harder to display targeted ads around the web. Last year, Mozilla even chose to automatically block third-party cookies from websites that you hadn’t visited. Now, Mozilla wants to play ball with advertisers.”

The faithful didn’t like it. In Daring Fireball, John Gruber wrote, “What a pile of obtuse horseshit. If you want to sell ads, sell ads. Own it. Don’t try to coat it with a layer of frosting and tell me it’s a fucking cupcake.”

Then Mitchell issued a corrective blog post, titled Content, Ads, Caution. Here’s an excerpt:

When we have ideas about how content might be useful to people, we look at whether there is a revenue possibility, and if that would annoy people or bring something potentially useful. Ads in search turn out to be useful. The gist of the Tiles idea is that we would include something like 9 Tiles on a page, and that 2 or 3 of them would be sponsored — aka “ads.” So to explicitly address the question of whether sponsored tiles (aka “ads”) could be included as part of a content offering, the answer is yes.

These sponsored results/ads would not have tracking features.

Why would we include any sponsored results? If the Tiles are useful to people then we’ll generate value. That generates revenue that supports the Mozilla project. So to explicitly address the question of whether we care about generating revenue and sustaining Mozilla’s work, the answer is yes. In fact, many of us feel responsible to do exactly this.

Clearly Mozilla equates producing revenue with advertising, and intends to continue down a path that many of its most passionate users don’t like. This position is easy to rationalize, given Mozilla’s business model and need to stay alive.

By becoming an advertising company (in addition to everything else it is), Mozilla now experiences a problem that has plagued ad-supported media for the duration: its customers and consumers are different populations. I saw it when I worked in commercial broadcasting, and I see it today in the online world with Google, Facebook, Twitter… and Mozilla. The customers (or at least the main ones) are either advertisers or proxies for them (Google in Mozilla’s case). The consumers are you and me.

The difference with Mozilla is that it didn’t start out as an advertising company. So becoming one involves a change of nature — a kind of Breaking Bad.

It hurts knowing that Mozilla is the only browser-maker that comes from our side, and wants to stay here, and treat us right. Apple clearly cares about customers (witness the success of their stores, and customer service that beats all the competition’s), but its browser, Safari, is essentially a checkbox item. Same goes for Microsoft, with Explorer. Both are theirs, not ours. Opera means well, but it’s deep in fifth place, with a low single-digit market share. Google’s Chrome is a good browser, but also built to support Google’s advertising-based business model. But only Mozilla has been with us from the start. And now here they are, trying their best not to talk like they’ve been body-snatched by the IAB.

And it’s worse than just that.

In addition to the Brendan Eich mess, Mozilla is coping with losing three of its six board members (who left before Brendan resigned). Firefox’s market share is also declining: from 20.63% in May 2013 to 17.68% in February 2014. (Other numbers here.)

Is it just a coincidence that May 2013 is also when Jay Sullivan made that first post, essentially announcing Mozilla’s new direction, toward helping the online advertising industry? Possibly. But that’s not what matters.

What matters is that Mozilla needs to come back home: to Earth, where people live, and where the market is a helluva lot bigger than just advertising. I see several exciting paths for getting back. Here goes.

1) Offer a choice of browsers.

Keep Firefox free and evolving around an advertising-driven model.

And introduce a new one, built on the same open source code base, but fully private, meaning that it’s the person’s own, to be configured any way they please — including many new ways not even thinkable for a browser built to work for advertisers. Let’s call this new browser PrivateFox. (Amazingly, the obvious domain name was still available until I bought it last night. I’ll be glad to donate it to Mozilla.)

Information wants to be free, but value wants to be paid for. Since PrivateFox would have serious value for individuals, it would have a price tag. Paying for PrivateFox would make individuals actual customers rather than just “users,” “consumers,” “targets” and an “audience.” Mozilla could either make the payment voluntary, as with public radio and shareware, or it could make the browser a subscription purchase. That issue matters far less than the vast new market opportunities that open when the customer is truly in charge: something we haven’t experienced in the nineteen years that have passed since the first commercial websites went up.

PrivateFox would have privacy by design from the start: not just in the sense of protecting people from unwelcome surveillance, but in the same way we are private when we walk about the marketplace in the physical world. We would have the digital equivalent of clothing to hide the private parts of our virtual bodies. We would also be anonymous by default — yet equipped with wallets, purses, and other instruments for engagement with the sellers of the world.

With PrivateFox, we will be able to engage all friendly sites and sellers in ways that we choose, and on terms of our choosing as well. (Some of those terms might actually be more friendly than those one-sided non-agreements we submit to all the time without reading. For more on what can be done on the legal front, read this.)

(Yes, I know that Netscape failed at trying to charge for its browser way back in the early days. But times were different. What was a mistake back then could be a smart move today.)

2) Crowdsource direct funding from individuals.

That’s a tall order — several hundred million dollars’ worth — but hey, maybe it can be done. I’d love to see an IndieGoGo (or equivalent) campaign for “PrivateFox: The World’s First Fully Private Browser. Goal: $300 million.”

3) Build intentcasting into Firefox as it stands.

Scott Adams (of Dilbert fame) calls it “broadcast shopping”. He explains:

Shopping is broken. In the fifties, if you wanted to buy a toaster, you only had a few practical choices. Maybe you went to the nearest department store and selected from the three models available. Or maybe you found your toaster in the Sears catalog. In a way, you were the hunter, and the toaster was the prey. You knew approximately where it was located, and you tracked it down and bagged it. Toasters couldn’t hide from you.

Now you shop on the Internet, and you can buy from anywhere on the planet. The options for any particular purchase approach infinity, or so it seems. Google is nearly worthless when shopping for items that don’t involve technology. It is as if the Internet has become a dense forest where your desired purchases can easily hide.

Advertising is broken too, because there are too many products battling for too little consumer attention. So ads can’t hope to close the can’t-find-what-I-want gap.

The standard shopping model needs to be reversed. Instead of the shopper acting as hunter, and the product hiding as prey, you should be able to describe in your own words what sort of thing you are looking for, and the vendors should use those footprints to hunt you down and make their pitch.

There are many ways of doing this. More than a dozen appear under “Intentcasting” in this list of VRM developers. Some are under wraps, but have huge potential.

Intentcasting sets a population comprised of 100% qualified leads loose in the marketplace, all qualifying their lead-ness on their own terms. This will be hugely disruptive to the all-guesswork business that cherishes a 1% click-through rate in “impressions” that mostly aren’t — and ignores the huge negative externalities generated by a 99+% failure rate. It will also generate huge revenues, directly.

This would be a positive, wealth-creating move that should make everybody (other than advertising mill-keepers) happy. Even advertisers. Trust me: I know. I co-founded and served as Creative Director for Hodskins Simone & Searls, one of Silicon Valley’s top ad agencies for the better part of two decades. Consider this fact: No company that advertises defines itself as “an advertiser.” They have other businesses. Advertising might be valuable to them, but it’s still just a line item on the expense side of the balance sheet. They can cut or kill it any time they want.

“Buy on the sound of cannons, sell on the sound of trumpets,” Lord Nathan Rothschild said. For the last few years advertising has been one giant horn section, blasting away. If online advertising isn’t a bubble (which I believe it is), it at least qualifies as a mania. And it is the nature of manias to pass.

Business-wise, investing in an advertising strategy isn’t a bad bet for Mozilla right now. But the downsides are real and painful. Mozilla can reduce that pain by placing other bets: ones on the demand side of the marketplace, and not just — like everybody else — on the supply side.

Here on Earth we have a landing site for Mozilla, where the above and many other ideas can be vetted and hashed out with the core constituency: IIW, the Internet Identity Workshop. It’s an inexpensive three-day unconference that runs twice every year in the heart of Silicon Valley, at the Computer History Museum: an amazing venue.

Phil Windley, Kaliya Hamlin and I have been putting on IIW since 2005. We’ve done seventeen so far, and it’s impossible to calculate how far sessions there have moved forward the topics that come up, all vetted and led by participants.

Here’s one topic I promise to raise on Day One: How can we help Mozilla? Lots of Mozilla folk have been at IIWs in the past. This time, participation will have more leverage than ever.

I want to see lots of lizards and lizard-helpers there.


by Doc Searls at April 13, 2014 12:22 AM

April 12, 2014


rtorrent: Needs no introduction … again

Since I’m on to tools that everyone knows and that are quite popular, I might as well throw rtorrent into the mix.


By most accounts it’s the program that I had no real part in developing, but that seemed to bring me a lot of attention. It’s hardly fair, and I should probably apologize for riding its coattails.

But the strange part is, nearly a decade later, it’s still the smartest, leanest, sharpest torrent client for the console there is, and rivals a lot of graphical ones too.

In my lowly opinion, of course. ;)

So again, I won’t waste your time by fawning over rtorrent ad nauseam. And I won’t waste my time writing about something that I’ve fawned over ad nauseam already. Again, and again, and again. … :shock:

Let’s just assume you know about it, and its endless progeny, and you’re also a fan. Next, please. …

Tagged: client, download, manager, torrent

by K.Mandla at April 12, 2014 11:45 PM

rsync: Needs no introduction

I don’t think there’s much I can say about rsync that isn’t already common knowledge or preaching to the choir.

kmandla@6m47421: ~/downloads$ rsync -ah --progress source/ destination/
sending incremental file list
            925 100%    0.00kB/s    0:00:00 (xfr#1, to-chk=9/11)
            835 100%  815.43kB/s    0:00:00 (xfr#2, to-chk=8/11)
            892 100%  871.09kB/s    0:00:00 (xfr#3, to-chk=7/11)
            901 100%  879.88kB/s    0:00:00 (xfr#4, to-chk=6/11)
            893 100%  872.07kB/s    0:00:00 (xfr#5, to-chk=5/11)
            900 100%  878.91kB/s    0:00:00 (xfr#6, to-chk=4/11)
            886 100%  865.23kB/s    0:00:00 (xfr#7, to-chk=3/11)
            832 100%  812.50kB/s    0:00:00 (xfr#8, to-chk=2/11)
            883 100%  862.30kB/s    0:00:00 (xfr#9, to-chk=1/11)
            888 100%  433.59kB/s    0:00:00 (xfr#10, to-chk=0/11)

kmandla@6m47421: ~/downloads$ 

rsync is, was, and has been one of my favorite tools for a very long time, and short of single-file, one-target copies, it’s the one thing I use to copy, back up, synchronize or just plain double-check.

rsync works across networks, across directories and within file trees. It gives clean progress indicators, can run completely silent, can delete files that aren’t in the source folder, and will skip files that are already up to date at the destination. Just tell it what you want.

I think that will do for now. Like I said at the start, if you know it, there’s no point in me gloating over it. And if you don’t … waste no time in trying it out. ;)

Tagged: backup, copy, sync, synchronize

by K.Mandla at April 12, 2014 10:43 PM

sacha chua :: living an awesome life

Monthly review: March 2014

Last month, I:

  • had fun with Emacs
    • coded numerous little Emacs conveniences
    • learned how to make graphs in Org Mode
    • integrated Emacs Org Mode with Quantified Awesome
    • helped lots of people with Emacs
    • started the Emacs Basics video series
    • set up more Emacs chats
  • and geeked around with other things
    • started playing around with the Raspberry Pi, motion detection, and image processing with simplecv
    • learned more about NodeJS
    • upgraded to Ubuntu Precise, Ruby 2.0
    • went to Gamfternoon at Hacklab
  • drew a little
    • finally updated my Twitter background
    • lined up another sketchnoting gig
    • put together the print version of Sketchnotes 2013, yay LaTeX!
  • and took care of other stuff
    • filed our taxes
    • delegated more writing

In other news, I really like the new monthly review code I’ve added to Emacs:

Here’s the snippet:

(defun sacha/org-review-month (start-date)
  "Review the month's clocked tasks and time."
  (interactive (list (org-read-date)))
  ;; Set to the beginning of the month
  (setq start-date (concat (substring start-date 0 8) "01"))
  (let ((org-agenda-show-log t)
        (org-agenda-start-with-log-mode t)
        (org-agenda-start-with-clockreport-mode t)
        (org-agenda-clockreport-parameter-plist '(:link t :maxlevel 3)))
    (org-agenda-list nil start-date 'month)))

In April, I want to:

  • Record and set up more Emacs chats
  • Make open source contribution part of my routine (mailing lists, patches, sharing)

Blog posts

The post Monthly review: March 2014 appeared first on sacha chua :: living an awesome life.

by Sacha Chua at April 12, 2014 09:58 PM


The Error of Cowards

The sole philosophy open to those who doubt the possibility of truth is absolute silence--even mental. That is to say, as Aristotle points out, such men must make themselves vegetables. No doubt reason often errs, especially in the highest matters, and, as Cicero said long ago, there is no nonsense in the world which has not found some philosopher to maintain it, so difficult is it to attain truth. But it is the error of cowards to mistake a difficulty for an impossibility.

Jacques Maritain, An Introduction to Philosophy (New York: Sheed and Ward, 1933), p. 181.

by Brandon at April 12, 2014 09:45 PM

Colin Walters

GNOME West Coast Summit end

The West Coast Summit 2014 is over now, and I’m glad I was able to attend. There’s absolutely no substitute for getting a distributed group of people together for face to face conversations about their common interest in GNOME. Thanks to Endless Mobile for providing their office as a venue and sponsoring the event!

It was really great to see familiar faces like Germán, Giovanni, and Kristian (among many others!). Breakout sessions on topics like GNOME on Wayland and Gjs were very successful. It was cool to see GNOME on Wayland (well, it looked the same actually, which was the goal ;) ). Giovanni did an amazing amount of work on investigating the Spidermonkey GC. Christian wowed people with a demo of Builder. I worked on Continuous and OSTree. In particular, I worked on the OSTree branch for static deltas, which should significantly speed up downloads.

See also posts from Sri and Matthias.

by Colin Walters at April 12, 2014 09:10 PM

Text Patterns

Peter Enns and the problem of boundaries

I just came across this 2013 post by Peter Enns:

I’ve had far too many conversations over the last few years with trained, experienced, and practicing biblical scholars, young, middle aged, and near retirement, working in Evangelical institutions, trying to follow Jesus and use their brains and training to help students navigate the challenging world of biblical interpretation.

And they are dying inside.  

Just two weeks ago I had the latest in my list of long conversations with a well-known, published, respected biblical scholar, who is under inhuman stress trying to negotiate the line between institutional expectations and academic integrity. His gifts are being squandered. He is questioning his vocation. His family is suffering. He does not know where to turn.

I wish this were an isolated incident, but it’s not.  

I wish these stories could be told, but without the names attached, they are worthless. I wish I had kept a list, but even if I had, it wouldn’t have done anyone much good. I couldn’t have used it. Good people would lose their jobs.

I’m getting tired of hearing the same old story again and again. This is madness.

Enns is right that this kind of story is all too common, and all too sad. I’ve known, and talked to, and counseled, and prayed with, a number of such people over the years, and they’re not all in Biblical Studies either. But here’s the thing: I have also talked to an equal or greater number of equally distressed Christian scholars whose problem is that they teach in secular institutions where they cannot express their religious convictions — in the classroom or in their scholarship — without being turned down for tenure or promotion, or (if they are contingent faculty or pre-tenure) simply being dismissed. Odd that Enns shows no awareness of this situation.

I think he doesn't because he wants to present as a pathology of evangelicalism what is more generally and seriously a pathology of the academic job market: people feeling intimidated or utterly silenced because if they lose their professorial position they know they stand almost no chance of getting another one. Moreover, this isn’t a strictly academic issue either: people all over the world and in all walks of life feel this way about their jobs, afraid of losing them but troubled by their consciences about some aspect of their workplace. But I think these feelings are especially intense among American academics because of the number of people who can’t imagine themselves doing anything other than being a professor — and also because of the peculiar forms of closure in the most “open” academic environments.

As Stanley Fish wrote some years ago in an essay called “Vicki Frost Objects”,

What, after all, is the difference between a sectarian school which disallows challenges to the divinity of Christ and a so-called nonideological school which disallows discussion of the same question? In both contexts something goes without saying and something else cannot be said (Christ is not God or he is). There is of course a difference, not however between a closed environment and an open one but between environments that are differently closed.

So if we’re going to have compassion for academics feeling trapped in institutions that are uncongenial to their beliefs, let’s be ecumenical about it.

Moreover, I can’t tell from his post exactly what Enns thinks should be done about the situation, even within the evangelical context. If he thinks that all that Christian colleges and seminaries have to do is to relax their theological statements — well, that would be grossly naïve. No matter how tightly or loosely a religious institution defines itself, there will always be people on the boundaries, edge cases who will feel uncomfortable at best or coerced into submission at worst. And if, like the modern university, an institution insists that it has no such limitations on membership at all, then that will simply mean, as Fish makes clear, that the boundaries are there but unstated and invisible — until you cross them.

by Alan Jacobs at April 12, 2014 08:23 PM


Extracurricular Activities — April 12, 2014

Some Scientists Say Papyrus Referring to Jesus’ Wife Is More Likely Ancient Than Fake

A faded fragment of papyrus known as the “Gospel of Jesus’s Wife,” which caused an uproar when unveiled by a Harvard Divinity School historian in 2012, has been tested by scientists who conclude, in a journal article published on Thursday, that the ink and papyrus are very likely ancient, and not a modern forgery.

Nicholas Perrin Interviewed on Importance of Jesus' Wife Papyrus Dating

Both the 2012 announcement and yesterday's drew headlines worldwide—far more attention than other manuscript fragments purportedly from the fourth to eighth centuries. Should we care? Does this tell us anything about Jesus or early Christianity? We asked Nicholas Perrin...

Can someone, on the basis of this fragment, say, "A-ha! So now we know Jesus was married"?

The Church Needs Philosophers and Philosophers Need the Church 

A widely held misconception about the discipline of philosophy and those of us who like to think of ourselves as philosophers: philosophy provides no worldly good, no non-cognitive benefit, and is of limited value. Those of us who have committed the double sin of being a Christian and a philosopher risk further marginalization, often viewed with suspicion by the church as well... As we navigate an increasingly pragmatic university setting and the suspicious gaze of the church, it is easy to feel—like a severed hand—a bit homeless. But before you pass the hemlock, I plead my case: the church needs philosophers and philosophers need the church.

Tim Challies Gives 2 Proposals to the Question "How Many People Go To Your Church?"

I’d like to make the same two-part proposal I made a few years back: Let’s stop asking, “How many people go to your church?” And when someone asks us that question, let’s not feel obliged to give a direct answer.

David Crabb at Desiring God Reflects on "Bible-Balance in Christian Ministry"

You don’t hear a lot about it in seminary. It doesn’t get much discussion in pastoral theology books. But one of the more complex challenges of Christian leadership is cultivating a Bible-balanced ministry. What does it mean for a church, or a ministry, to be Bible-balanced? Why is it important?


Extra-Curricular Activities is a weekly roundup of stories on biblical interpretation, theology, and issues where faith and culture meet. We found each story interesting, thought-provoking, challenging, or useful in some way – but we don't necessarily agree with or endorse every point in every story.

If you have any comments on these stories, we welcome you to share them here. We hope you enjoy!

–The Editors of Koinonia Blog


by Jeremy Bouma at April 12, 2014 08:04 PM

assertTrue( )

Frameshift errors in leprosy bacterium DNA

Shocking as it might sound, leprosy continues to strike over 200,000 persons per year worldwide, making it as much of a health problem as cholera or yellow fever. One of the oldest known infectious diseases, leprosy became the first disease to be causally linked to bacteria when Hansen made his famous discovery of the connection to Mycobacterium leprae in 1873. Ever since then, scientists have been trying to grow M. leprae in the lab, to no avail. Like most environmental isolates, M. leprae defies attempts at pure culture. The only way to grow it in the lab is to infect mice or armadillos, where it has a doubling time of 14 days, the longest known generation time of any bacterium.

Traditionally, it has been assumed that the difficulty in growing M. leprae in pure culture is due to the organism's complex nutritional requirements. (In humans, the organism is an obligate intracellular parasite that takes up residency in the Schwann cells of the peripheral nervous system.) There is no doubt considerable truth to this assumption, but the reason for the organism's fastidious nutritional requirements wasn't fully known until Cole et al. (2001) showed that half the bacterium's genome is inoperative and undergoing decay. Genomic sequencing revealed that M. leprae has only three quarters the DNA content of its (quite robust) cousin, M. tuberculosis, and of M. leprae's 3,000-or-so remaining genes, only 1,600 are fully functional. The rest are pseudogenes.

Pseudogenes are genes that have become inactivated through loss of start codons, loss of promoter regions, introduction of spurious stop codons, introduction of frameshift errors, or through other causes. Almost all organisms contain pseudogenes in their DNA. (Human DNA reportedly contains over 12,000 pseudogenes.) The leprosy bacterium, however, is unique in having approximately half its genome tied up in pseudogenes. Once a gene becomes a pseudogene, it is effectively useless baggage ("junk DNA") and continues on a long path of deterioration. Evolutionary theory predicts that such genes will eventually be lost from the genome, since the carrying cost of keeping them puts the organism at a disadvantage, energetically. But the curious thing about M. leprae is that it's a hoarder: It not only holds onto its useless genes, it actually transcribes upwards of 40% of them. In fact, a recent study of 1000-year-old M. leprae DNA (recovered from medieval skeletons), comparing the medieval version of the organism's genome with the genome of today's M. leprae, found that pseudogenes are highly conserved in the bacterium.

The fact that the bacterium actually transcribes many of its pseudogenes (and doesn't lose them over time) is striking, to say the least, and suggests that the transcription of certain genes or pseudogenes is resulting in mRNAs that silence other, more deleterious genes.  It could be that M. leprae can't be grown in culture because when certain combinations of nutrients are presented to it, the nutrients up-regulate deleterious nonsense genes in otherwise-normal operons (or down-regulate important silencers), directly or indirectly. (Williams et al. found that many M. leprae pseudogenes are located in the middle of operons and are transcribed via fortuitous read-through.) Various scenarios are possible. Much work remains to be done.

In the meantime, I couldn’t help doing a little desktop science to characterize M. leprae’s “defective genes” problem further. I went to the genome database and entered “Mycobacterium leprae Br4923” in the Organism Name field. In the Genome Information box, if you click the “Click for Features” link, you can see that 1604 genes are labeled “CDS” (meaning, these are the operative, non-defective genes) while a separate line item shows an utterly astounding 2233 genes as pseudogenes. (Addendum: The pseudogene FASTA file contains duplicates. The actual pseudogene count, it turns out, is 1116, not 2233. But still, 1116 is a huge number of pseudogenes.) The “DNA Seqs” links on the right side of that page allow you to download the FASTA sequences for the respective gene groupings. These are simple text files containing the base sequences (A, T, G, and C) for the coding strands of the genes.

I wrote a few lines of JavaScript to analyze the base compositions of the genes (and pseudogenes), and what I noticed immediately is that the base composition differs between the two groups:

Base(s)          Content (Genes)    Content (Pseudogenes)
G+C              60.6%              55.4%
A+G (purines)    50.5%              49.8%

The G+C content for the "normal" genes averages 60.6%, whereas for the pseudogenes it's 55.4%. A typical G+C value for other members of the genus Mycobacterium is 65%. Thus, it's clear that not only the pseudogenes but the "normal" genes of M. leprae have drifted in the direction of more A+T. This has been noted before (by Cole et al. and others). What's perhaps less obvious is that purine content (A+G) has shifted from 50.5% in the normal genes to 49.8% in the pseudogenes. Bear in mind we're looking at data for one strand of DNA: the so-called coding or "message" strand.
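The tally behind these figures is straightforward. The post mentions a few lines of JavaScript; an equivalent sketch in Python (a generic stand-in, not the author's actual code — the function name and FASTA handling here are illustrative) might look like:

```python
def base_fractions(fasta_text):
    """Return (G+C fraction, A+G fraction) for all sequence data
    in a FASTA-formatted string, ignoring header lines."""
    counts = {"A": 0, "T": 0, "G": 0, "C": 0}
    for line in fasta_text.splitlines():
        if line.startswith(">"):      # FASTA header line, not sequence
            continue
        for base in line.strip().upper():
            if base in counts:        # skip ambiguity codes such as N
                counts[base] += 1
    total = sum(counts.values())
    gc = (counts["G"] + counts["C"]) / total       # G+C content
    purines = (counts["A"] + counts["G"]) / total  # A+G content
    return gc, purines
```

Running something like this separately over the CDS file and the pseudogene file yields the G+C and purine percentages quoted above.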

Clearly, there is a tendency for pseudogenes to "regress to the mean." But the shift in purine concentration is particularly interesting, because it indicates that purine usage in normal-gene coding regions is perhaps non-randomly elevated. The shift from 50.5% to 49.8% in A+G content may not seem particularly striking on its own, but the difference, it turns out, is highly significant. You can see why in the following graph.

Base composition of "normal" genes in M. leprae (total purines vs. G+C) by codon base position (n=1604). Red dots are for base one, gold dots are for base two, blue dots are for base three (the "wobble" base). See text for discussion.

To make this graph, I looked at the DNA of the coding regions of "normal" genes and determined the average purine content as well as the G+C content for positions one, two, and three of all codons. As you can see, the purine content (relative to the G+C content) segregates non-randomly according to codon base position. The red dots represent base one, the gold (or brown) dots represent base two, and the blue dots represent base three (often called the "wobble" base, for historical reasons). Not unexpectedly, the greatest G+C shift occurs in base three (as is usually the case). What's perhaps more surprising is the clear preference for purines in base one. The red cluster centers at y = 0.6051 plus or minus 0.0467 (standard deviation). This means that on average, position one of a codon is occupied by a purine (A or G) over 60% of the time. This is actually quite typical of codons in most organisms. I've looked at over 1,300 bacterial species so far, and in all of them, purines accumulate at codon base one. (Maybe in a future post, I'll present more data to this effect.)
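The per-position bookkeeping described above can be sketched in a few lines of Python (again an illustrative stand-in, not the original analysis code; the function name is hypothetical):

```python
def codon_position_stats(seq):
    """For one gene's coding sequence, return a list of three
    (purine_fraction, gc_fraction) pairs, one per codon position."""
    seq = seq.upper()
    stats = []
    for offset in range(3):        # codon positions one, two, three
        bases = seq[offset::3]     # every third base, starting at offset
        n = len(bases)
        purine = sum(b in "AG" for b in bases) / n
        gc = sum(b in "GC" for b in bases) / n
        stats.append((purine, gc))
    return stats
```

Plotting one (purine, gc) point per gene for each of the three codon positions is what produces the red, gold, and blue clouds.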

Base two segregates out as having a G+C content significantly below the organism's total-genome G+C content and centers on y = 0.4434 (median) plus or minus 0.0547 (SD).

Now compare the above graph with a similar graph for M. leprae's pseudogenes:

Base composition of M. leprae pseudogenes by codon position (n=2233). Again, red dots are for base one, gold are for base two, blue are for base three. See text for discussion.

Here, it's evident that base compositions for all three codon positions overlap significantly. The fact that the codon positions are no longer clearly defined in their spatial representation on this graph is consistent with widespread frameshift mutations in the DNA, causing bases that would normally be in position one (or two or three) to be in some other position, randomly.

Hence we can say, with some confidence, on the basis of these graphs, that many (if not most) of the "junk genes" in M. leprae harbor frameshift mutations. The question of which came first—frameshift mutations, or silencing of genes (followed by frameshifts)—is still open. But we know for certain frameshifts are indeed rampant in the M. leprae pseudogenome.

Exactly how or why M. leprae accumulated so many frameshift mutations (and then kept hoarding the mutated genes) is unknown. As I said earlier, much work remains to be done.

Note: Graphs were produced using an excellent online charting service. Hand-editing the SVG output (before conversion to PNG) made it easy to change data-point colors in a text editor. Data points were plotted at opacity 0.30 so that areas of high overlap are more apparent visually (as data points pile on top of one another).

Bioinformaticists (and others!), feel free to leave a comment below.

by Kas Thomas at April 12, 2014 07:53 PM

A binary signal in the second codon base

In looking at base composition statistics of codons, an amazing fact jumps out.

If you look carefully at the following graph, you can see that the cloud of gold-colored data points (representing the compositional stats for the second base in codons of Clostridium botulinum) has a second, "breakaway" cloud underneath it. (See arrow.)

Codon base composition statistics for Clostridium botulinum. Notice the breakaway cloud of gold points under the main cloud (arrow). These points represent genes in which most codons have a pyrimidine in the middle base of each codon.

To review: I made this plot by going through the DNA sequence of each coding ("CDS") gene of C. botulinum, and for each gene, I went through all codons and calculated the average purine content (as well as the average G+C content) of the bases at positions one, two, and three of the codons. Thus, every dot represents the stats for one gene's worth of data.

After looking at graphs of this sort, three key facts about codon bases leap out:
  • Most codons, for most genes, have a purine as the first base (notice how the red cloud of points is higher than the others, centering on y=0.7).
  • The third base (often called the "wobble" base; shown blue here) has the most extreme G+C value. (This is well known.)
  • The middle base falls in one of two positions (high or low) on the purine (y) axis. There's a primary cloud of data points and a secondary cloud in a distinct region below the main cloud. The secondary cloud of gold points is centered at about 0.3 on the y-axis, meaning these are genes in which the second codon base tends to be a pyrimidine.
The question is: What does it mean when you look at a gene with 200 or 300 or 400 codons, and the majority of codons have a pyrimidine in the second base?

If you examine the standard codon translation table (below), you can see that codons with a pyrimidine in the second position (represented by the first two columns of the table) code primarily for nonpolar amino acids. When a pyrimidine is in the second base, the possible amino acids are phenylalanine, serine, leucine, proline, isoleucine, methionine, threonine, valine, and alanine. Of these, all but serine and threonine are nonpolar. Therefore, a pyrimidine in position two of a codon means there's at least a 75% chance that the amino acid will have a nonpolar, hydrophobic side group.
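The 75% figure can be checked directly against the standard genetic code. Here's a quick sketch (the helper function is mine and covers only the 32 codons with U or C in position two; following the text, serine and threonine are counted as polar):

```python
from itertools import product

def aa_for(codon):
    """Standard-code amino acid for a codon whose middle base is U or C."""
    b1, b2, b3 = codon
    if b2 == "U":
        if b1 == "U":
            return "Leu" if b3 in "AG" else "Phe"  # UUA/UUG = Leu; UUU/UUC = Phe
        if b1 == "C":
            return "Leu"                           # CUN = Leu
        if b1 == "A":
            return "Met" if b3 == "G" else "Ile"   # AUG = Met; AUU/AUC/AUA = Ile
        return "Val"                               # GUN = Val
    return {"U": "Ser", "C": "Pro", "A": "Thr", "G": "Ala"}[b1]  # NCN codons

NONPOLAR = {"Phe", "Leu", "Ile", "Met", "Val", "Pro", "Ala"}  # Ser, Thr excluded

# All 32 codons with a pyrimidine (U or C) in position two.
codons = ["".join(c) for c in product("UCAG", "UC", "UCAG")]
frac = sum(aa_for(c) in NONPOLAR for c in codons) / len(codons)
print(len(codons), frac)  # 32 codons, fraction 0.75
```

Of the 32 NyN codons, exactly 24 encode nonpolar residues: the 75% floor quoted above.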

Virtually all proteins contain some nonpolar amino acids, but when a protein contains mostly nonpolar amino acids, that protein is destined to wind up in the cell membrane (which is largely made up of lipids). Thus, we can expect to find that genes in the "breakaway" cloud of gold points in the graph further above represent membrane-associated proteins.

To check this hypothesis, I wrote a script that harvested the gene names, and protein-product names, of all the "breakaway cloud" data points. After purging genes annotated (unhelpfully) as "hypothetical protein," I was left with 37 "known" genes. They're shown in the following table.

Gene Product
CLC_0571 arsenical pump family protein
CLC_1058 L-lactate permease
CLC_1550 carbohydrate ABC transporter permease
CLC_3115 xanthine/uracil permease family protein
CLC_0813 arsenical-resistance protein
CLC_1018 putative anion ABC transporter, permease protein
CLC_1687 xanthine/uracil permease family protein
CLC_3633 sporulation integral membrane protein YtvI
CLC_2382 phosphate ABC transporter, permease protein PstA
CLC_0189 ZIP transporter family protein
CLC_3351 sodium:dicarboxylate symporter family protein
CLC_0971 cobalt transport protein CbiM
CLC_1534 methionine ABC transporter permease
CLC_2798 xanthine/uracil permease family protein
CLC_1397 manganese/zinc/iron chelate ABC transporter permease
CLC_1836 stage III sporulation protein AD
CLC_0528 high-affinity branched-chain amino acid ABC transporter, permease protein
CLC_0430 electron transport complex, RnfABCDGE type, A subunit
CLC_2523 flagellar biosynthesis protein FliP
CLC_0401 amino acid permease family protein
CLC_0383 lrgB-like family protein
CLC_0457 chromate transporter protein
CLC_0291 sodium:dicarboxylate symporter family protein
CLC_0427 electron transport complex, RnfABCDGE type, D subunit
CLC_1281 putative transcriptional regulator
CLC_2008 ABC transporter, permease protein
CLC_0868 branched-chain amino acid transport system II carrier protein
CLC_1237 monovalent cation:proton antiporter-2 (CPA2) family protein
CLC_1137 methionine ABC transporter permease
CLC_0764 putative drug resistance ABC-2 type transporter, permease protein
CLC_1953 xanthine/uracil permease family protein
CLC_2444 auxin efflux carrier family protein
CLC_0897 putative ABC transporter, permease protein
CLC_1555 C4-dicarboxylate transporter/malic acid transport protein
CLC_0374 xanthine/uracil permease family protein
CLC_0470 undecaprenyl pyrophosphate phosphatase
CLC_2648 monovalent cation:proton antiporter-2 (CPA2) family protein

Notice that with the exception of CLC_1281, a "putative transcriptional regulator," every gene product represents a membrane-associated protein: transporters, carrier proteins, permeases, etc.

I ran the same experiment on genes from Streptomyces griseus (strain XylbKG-1) and came up with 222 genes having high pyrimidine content in base two. All 222 genes specify membrane-associated proteins. (The full list is in a table below.)

The bottom line: Base two of codons acts as a binary switch. If the base is a pyrimidine, the associated amino acid will most likely (75% chance) be nonpolar. If the base is a purine, the codon will either be a stop codon (3 out of 32 codons) or the amino acid will be polar (26 out of 29 codons).

Here's the list of 222 genes from S. griseus in which the middle codon base is predominantly a pyrimidine:

SACT1_0608 ABC-type transporter, integral membrane subunit
SACT1_3730 major facilitator superfamily MFS_1
SACT1_4066 cation efflux protein
SACT1_5911 ABC-2 type transporter
SACT1_6577 SNARE associated protein
SACT1_6966 ABC-type transporter, integral membrane subunit
SACT1_7160 major facilitator superfamily MFS_1
SACT1_3151 NADH-ubiquinone/plastoquinone oxidoreductase chain 6
SACT1_3682 drug resistance transporter, EmrB/QacA subfamily
SACT1_5431 Citrate transporter
SACT1_3199 proton-translocating NADH-quinone oxidoreductase, chain M
SACT1_7301 putative integral membrane protein
SACT1_3198 NAD(P)H-quinone oxidoreductase subunit 2
SACT1_2008 arsenical-resistance protein
SACT1_3149 proton-translocating NADH-quinone oxidoreductase, chain L
SACT1_3967 MATE efflux family protein
SACT1_3148 proton-translocating NADH-quinone oxidoreductase, chain M
SACT1_5571 major facilitator superfamily MFS_1
SACT1_0651 major facilitator superfamily MFS_1
SACT1_2669 ABC-type transporter, integral membrane subunit
SACT1_1805 NADH dehydrogenase (quinone)
SACT1_3147 NAD(P)H-quinone oxidoreductase subunit 2
SACT1_6961 major facilitator superfamily MFS_1
SACT1_0992 ABC-2 type transporter
SACT1_2619 major facilitator superfamily MFS_1
SACT1_0507 major facilitator superfamily MFS_1
SACT1_0649 ABC-type transporter, integral membrane subunit
SACT1_0800 glycosyl transferase family 4
SACT1_1659 ABC-type transporter, integral membrane subunit
SACT1_1803 multiple resistance and pH regulation protein F
SACT1_4190 putative ABC transporter permease protein
SACT1_5522 drug resistance transporter, EmrB/QacA subfamily
SACT1_5568 drug resistance transporter, EmrB/QacA subfamily
SACT1_7248 Lysine exporter protein (LYSE/YGGA)
SACT1_0266 ABC-2 type transporter
SACT1_0847 Na+/solute symporter
SACT1_4378 ABC-type transporter, integral membrane subunit
SACT1_6766 ABC-type transporter, integral membrane subunit
SACT1_2522 putative integral membrane protein
SACT1_4762 amino acid permease-associated region
SACT1_4901 major facilitator superfamily MFS_1
SACT1_2616 multiple antibiotic resistance (MarC)-related protein
SACT1_3961 major facilitator superfamily MFS_1
SACT1_6236 MIP family channel protein
SACT1_1319 protein of unknown function UPF0016
SACT1_2332 copper resistance D domain protein
SACT1_5327 ABC-type transporter, integral membrane subunit
SACT1_5759 ABC-2 type transporter
SACT1_1133 ABC-type transporter, integral membrane subunit
SACT1_1562 ABC-type transporter, integral membrane subunit
SACT1_5518 major facilitator superfamily MFS_1
SACT1_3430 ABC-type transporter, integral membrane subunit
SACT1_4517 major facilitator superfamily MFS_1
SACT1_4565 drug resistance transporter, EmrB/QacA subfamily
SACT1_4994 ABC-type transporter, integral membrane subunit
SACT1_7197 major facilitator superfamily MFS_1
SACT1_5949 major facilitator superfamily MFS_1
SACT1_6233 protein of unknown function DUF6 transmembrane
SACT1_0936 ABC-type transporter, integral membrane subunit
SACT1_4993 2-aminoethylphosphonate ABC transporter, permease protein
SACT1_6954 ABC-type transporter, integral membrane subunit
SACT1_1846 Lysine exporter protein (LYSE/YGGA)
SACT1_3429 ABC-type transporter, integral membrane subunit
SACT1_3957 membrane protein of unknown function
SACT1_4612 ABC-2 type transporter
SACT1_1998 polar amino acid ABC transporter, inner membrane subunit
SACT1_2093 ABC-type transporter, integral membrane subunit
SACT1_4420 ABC-2 type transporter
SACT1_6613 putative ABC transporter permease protein
SACT1_2564 small multidrug resistance protein
SACT1_3669 major facilitator superfamily MFS_1
SACT1_4186 major facilitator superfamily MFS_1
SACT1_4850 ABC-type transporter, integral membrane subunit
SACT1_0206 major facilitator superfamily MFS_1
SACT1_5418 Lysine exporter protein (LYSE/YGGA)
SACT1_0548 ABC-type transporter, integral membrane subunit
SACT1_3332 protein of unknown function DUF6 transmembrane
SACT1_3764 sodium/hydrogen exchanger
SACT1_6278 protein of unknown function DUF6 transmembrane
SACT1_4143 ABC-2 type transporter
SACT1_3232 ABC-type transporter, integral membrane subunit
SACT1_0256 ABC-type transporter, integral membrane subunit
SACT1_2898 major facilitator superfamily MFS_1
SACT1_6510 protein of unknown function DUF6 transmembrane
SACT1_0980 CrcB-like protein
SACT1_1650 ABC-type transporter, integral membrane subunit
SACT1_4658 acyltransferase 3
SACT1_0306 protein of unknown function DUF6 transmembrane
SACT1_2372 major facilitator superfamily MFS_1
SACT1_7238 ABC-type transporter, integral membrane subunit
SACT1_0202 major facilitator superfamily MFS_1
SACT1_0591 ABC-type transporter, integral membrane subunit
SACT1_5369 protein of unknown function DUF81
SACT1_6227 C4-dicarboxylate transporter/malic acid transport protein
SACT1_6755 ABC-type transporter, integral membrane subunit
SACT1_0978 Urea transporter
SACT1_2418 ATP synthase subunit a
SACT1_5604 major facilitator superfamily MFS_1
SACT1_4891 ABC-type transporter, integral membrane subunit
SACT1_6507 protein of unknown function DUF6 transmembrane
SACT1_6754 ABC-type transporter, integral membrane subunit
SACT1_0787 ABC-2 type transporter
SACT1_2848 sodium/hydrogen exchanger
SACT1_0636 ABC-type transporter, integral membrane subunit
SACT1_1891 putative integral membrane protein
SACT1_1552 xanthine permease
SACT1_2894 putative secreted protein
SACT1_4508 sodium:dicarboxylate symporter
SACT1_7091 drug resistance transporter, EmrB/QacA subfamily
SACT1_2652 major facilitator superfamily MFS_1
SACT1_1741 amino acid permease-associated region
SACT1_1838 ABC-type transporter, integral membrane subunit
SACT1_2796 gluconate transporter
SACT1_5220 sodium/hydrogen exchanger
SACT1_6991 ABC-type transporter, integral membrane subunit
SACT1_3273 protein of unknown function DUF894 DitE
SACT1_7089 ABC-type transporter, integral membrane subunit
SACT1_7280 major facilitator superfamily MFS_1
SACT1_3467 major facilitator superfamily MFS_1
SACT1_1304 ABC-type transporter, integral membrane subunit
SACT1_6032 protein of unknown function DUF81
SACT1_4312 major facilitator superfamily MFS_1
SACT1_0876 ABC-type transporter, integral membrane subunit
SACT1_6123 citrate/H+ symporter, CitMHS family
SACT1_4359 Cl- channel voltage-gated family protein
SACT1_7325 branched-chain amino acid transport
SACT1_1160 protein of unknown function DUF140
SACT1_2265 Arsenical pump membrane protein
SACT1_3512 ABC-2 type transporter
SACT1_1018 major facilitator superfamily MFS_1
SACT1_3415 Xanthine/uracil/vitamin C permease
SACT1_5214 BioY protein
SACT1_3656 small multidrug resistance protein
SACT1_3895 SpdD2 protein
SACT1_4929 ABC-type transporter, integral membrane subunit
SACT1_3029 major facilitator superfamily MFS_1
SACT1_6312 ABC-type transporter, integral membrane subunit
SACT1_0919 L-lactate transport
SACT1_4356 ABC-2 type transporter
SACT1_0532 ABC-type transporter, integral membrane subunit
SACT1_6693 secretion protein snm4
SACT1_0967 ABC-type transporter, integral membrane subunit
SACT1_6496 major facilitator superfamily MFS_1
SACT1_6983 major facilitator superfamily permease
SACT1_0917 NADH-ubiquinone/plastoquinone oxidoreductase chain 3
SACT1_6887 major facilitator superfamily MFS_1
SACT1_2835 major facilitator superfamily MFS_1
SACT1_5544 drug resistance transporter, Bcr/CflA subfamily
SACT1_5591 ABC-type transporter, integral membrane subunit
SACT1_1201 ABC-type transporter, integral membrane subunit
SACT1_2404 ABC-2 type transporter
SACT1_0870 protein of unknown function DUF803
SACT1_6933 ABC-type transporter, integral membrane subunit
SACT1_1776 ABC-type transporter, integral membrane subunit
SACT1_3213 major facilitator superfamily MFS_1
SACT1_4210 phosphate ABC transporter, inner membrane subunit PstC
SACT1_5398 protein of unknown function DUF81
SACT1_0914 NADH-ubiquinone/plastoquinone oxidoreductase chain 6
SACT1_3261 major facilitator superfamily MFS_1
SACT1_6932 ABC-type transporter, integral membrane subunit
SACT1_0669 major facilitator superfamily MFS_1
SACT1_4255 ABC-2 type transporter
SACT1_4541 major facilitator superfamily MFS_1
SACT1_4638 major facilitator superfamily MFS_1
SACT1_0913 NADH-ubiquinone oxidoreductase chain 4L
SACT1_4443 protein of unknown function DUF81
SACT1_5396 Xanthine/uracil/vitamin C permease
SACT1_0912 NADH dehydrogenase (quinone)
SACT1_2924 Lysine exporter protein (LYSE/YGGA)
SACT1_5922 putative integral membrane protein
SACT1_1243 Bile acid:sodium symporter
SACT1_6967 ABC-type transporter, integral membrane subunit
SACT1_0911 proton-translocating NADH-quinone oxidoreductase, chain M
SACT1_3931 putative ABC transporter permease protein
SACT1_2820 ABC-2 type transporter
SACT1_3298 putative integral membrane transport protein
SACT1_4871 major facilitator superfamily MFS_1
SACT1_5873 major facilitator superfamily MFS_1
SACT1_6636 putative integral membrane protein
SACT1_0905 AbgT transporter
SACT1_5532 2-dehydro-3-deoxyphosphogluconate aldolase/4-hydroxy-2-oxoglutarate aldolase
SACT1_0910 proton-translocating NADH-quinone oxidoreductase, chain N
SACT1_2115 ABC-type transporter, integral membrane subunit
SACT1_4635 protein of unknown function DUF6 transmembrane
SACT1_5777 major facilitator superfamily MFS_1
SACT1_3979 major facilitator superfamily MFS_1
SACT1_5536 major facilitator superfamily MFS_1
SACT1_6782 major facilitator superfamily MFS_1
SACT1_0616 virulence factor MVIN family protein
SACT1_4869 ABC-type transporter, integral membrane subunit
SACT1_1581 putative integral membrane protein
SACT1_4585 major facilitator superfamily MFS_1
SACT1_6536 small multidrug resistance protein
SACT1_4024 cell cycle protein
SACT1_5296 major facilitator superfamily MFS_1
SACT1_1865 major facilitator superfamily MFS_1
SACT1_4868 ABC-type transporter, integral membrane subunit
SACT1_0955 protein of unknown function DUF803
SACT1_4296 major facilitator superfamily MFS_1
SACT1_5104 major facilitator superfamily MFS_1
SACT1_0519 Bile acid:sodium symporter
SACT1_2394 putative ABC transporter permease protein
SACT1_0661 major facilitator superfamily MFS_1
SACT1_2062 major facilitator superfamily MFS_1
SACT1_4295 ABC-type transporter, integral membrane subunit
SACT1_6828 peptidase M48 Ste24p
SACT1_3446 major facilitator superfamily MFS_1
SACT1_6631 Lysine exporter protein (LYSE/YGGA)
SACT1_1048 ABC-type transporter, integral membrane subunit
SACT1_1528 protein of unknown function DUF6 transmembrane
SACT1_2016 branched-chain amino acid transport
SACT1_6154 gluconate transporter
SACT1_5051 major facilitator superfamily MFS_1
SACT1_5531 protein of unknown function DUF81
SACT1_6480 protein of unknown function DUF1290
SACT1_0373 ABC-type transporter, integral membrane subunit
SACT1_2392 putative ABC transporter permease protein
SACT1_2724 protein of unknown function DUF107
SACT1_7257 protein of unknown function UPF0118
SACT1_2772 putative integral membrane protein
SACT1_3201 NAD(P)H-quinone oxidoreductase subunit 4L
SACT1_7066 ABC-type transporter, integral membrane subunit

Some of these genes are labeled "protein of unknown function," but based on what we do know about these proteins (namely, that they're hydrophobic), I think we can predict with high confidence that the gene products in question serve membrane-associated functions.

Bioinformatics geeks, leave a comment below.

by Kas Thomas at April 12, 2014 07:53 PM

Why do so many codons begin with a purine?

With the advent of sites where you can download genomes, create synteny graphs, run BLAST searches, and do all sorts of desktop bioinformatics, it's ridiculously easy for someone interested in comparative genomics to . . . well, compare genomes, for one thing. And if you look at enough gene sequences, a couple of things pop out.

One thing that pops out is that most codons, in most genes, begin with a purine (namely A or G: adenine or guanine). Also, codons typically show the greatest G+C swing in base number three. These trends can be seen in the table below, which shows the average base composition, by codon position, for three well-studied organisms. Note the purine (A+G) content at base one and the G+C content at base three.

Codon base     A      G      C      T

S. griseus
1              0.166  0.434  0.287  0.112
2              0.224  0.219  0.295  0.261
3              0.037  0.394  0.530  0.038

E. coli
1              0.256  0.343  0.238  0.161
2              0.291  0.181  0.222  0.304
3              0.186  0.285  0.261  0.265

C. botulinum
1              0.395  0.299  0.094  0.210
2              0.374  0.136  0.161  0.328
3              0.442  0.108  0.064  0.383

Streptomyces griseus is a common soil bacterium that happens to have very high genomic G+C content (72.1% overall, although you can see that in base three of codons the G+C content is more like 92%).

E. coli represents a middle-of-the-road organism in terms of G+C content (50.8% overall), while our ugly friend Clostridium botulinum (the soil organism that can ruin your whole day if it finds its way into a can of tomatoes) has very low genomic G+C content (around 28%).

Even though these organisms differ greatly in G+C content, they all illustrate the (universal) trend toward usage of purines (A or G) in the first position of a codon. Something like 59% to 69% of the time (depending on the organism), codons look like Pnn, where P is a purine base and 'n' represents any base. This is true for viruses as well as cellular genomes.

This pattern is so universal, one wonders why it exists. I think a credible, parsimonious explanation is that when protein-coding genes look like PnnPnnPnn (etc.), it makes for a crisp reading frame. It's easy to see that a +1 frameshift (a one-base insertion) leaves downstream codons repeating nPn, while a +2 frameshift leaves repeats of nnP. Both are easily distinguished from Pnn.
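This is easy to demonstrate on an idealized gene whose codons all match Pnn. In the sketch below (function name mine), "+1" and "+2" mean one- and two-base insertions upstream of the region being read, which is how such frameshifts play out in the downstream codons:

```python
PURINES = set("AG")

def purine_profile(seq):
    """Purine fraction at each of the three codon positions of seq."""
    codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
    return tuple(sum(c[p] in PURINES for c in codons) / len(codons) for p in range(3))

gene = "ATT" * 50                    # idealized Pnn gene: A is a purine, T a pyrimidine
print(purine_profile(gene))          # (1.0, 0.0, 0.0) -- the Pnn signature
print(purine_profile("C" + gene))    # (0.0, 1.0, 0.0) -- nPn after a +1 insertion
print(purine_profile("CC" + gene))   # (0.0, 0.0, 1.0) -- nnP after a +2 insertion
```

The three profiles are mutually exclusive, which is the "crisp reading frame" point: a frameshifted region announces itself in the position statistics.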

There are benefits for a PnnPnnPnn... reading frame. In a previous post, I showed that when most of a gene's codons have a pyrimidine in base two, the resulting protein gets shipped to the cell membrane. (This is a simple consequence of the fact that codons with a pyrimidine in position two tend to code for hydrophobic, lipid-soluble amino acids.) Because a +1 reading-frame shift produces repeats of nPn, the Pnn "default" pattern means that +1 frameshifted gene products, if they occur, won't get shipped to the cell membrane. This is an extremely important outcome, because membrane proteins are, in general, highly transcribed and under strong selective pressure. In addition to specifying antigenic properties and determining phage resistance, membrane proteins make up proton pumps, secretion systems, symporters, kinases, flagellar components, and many other kinds of proteins. They determine the cell's "interface" to the world. They also maintain cell osmolarity and membrane redox potential. Messing with membrane proteins is bound to be risky. Much better to keep frameshifted nonsense proteins away from the membrane.

Fairly strong support for this notion (that Pnn codons provide a crisp reading frame) comes from studies of naturally occurring frameshift signals in DNA. We now know that in many organisms, certain "slippery" DNA signals (usually heptamers, like CCCTGAC) instruct the ribosome to change reading frames. (See, for example, "A Gripping Tale of Ribosomal Frameshifting: Extragenic Suppressors of Frameshift Mutations Spotlight P-Site Realignment," Atkins and Björk, Microbiol. Mol. Biol. Rev. 2009. Also, for fun, be sure to check out some of the papers on quadruplet decoding, which leaves room for alien life forms with 200 amino acids instead of 20.) The "slippery heptamer" frameshift signals that have thus far been identified tend to contain runs of pyrimidines.

Also tending to support the "Pnn = crisp reading frame" notion is the fact that stop codons (TGA, TAA, TAG) look like pPP (where 'p' is a pyrimidine and 'P' is a purine). Again, a crisp distinction.

As for why purines were chosen (and not pyrimidines) to begin the Pnn pattern, again I think a fairly parsimonious answer is available: ATP and GTP are the most abundant nucleoside-triphosphates in vivo. These are the energy sources for nucleic-acid and protein synthesis, respectively.

A prediction: If we run into an alien life form (in the oceans of Europa, say) and it turns out to be the case that UTP (instead of ATP) is the "universal energy molecule" in that life form's cells, then that life form's codons will probably begin with U and form Unn triplets (or Unnn quadruplets, perhaps) a higher-than-average percentage of the time.

by Kas Thomas at April 12, 2014 07:52 PM

Are dead genes still alive in the leprosy bacillus?

The genome of the leprosy bacterium (Mycobacterium leprae) stands as a remarkable example of DNA in an apparent state of massive, wholesale breakdown. Of the organism's 2720 genes, only 1604 appear to be functional, while 1116 are pseudogenes, which is to say genes that have been "turned off" and left for dead.

Genes can become pseudogenes in any number of ways, including loss of a start codon, loss of promoter regions (or degraded Shine-Dalgarno signals), random insertions and deletions, mutations that cause spurious stop codons, and so on. Once a gene gets "turned off," assuming loss of the gene in question isn't fatal, the gene typically undergoes a period of degradation (leading to its eventual loss from the genome), but that's not exactly what we see in the leprosy bacterium. When leprosy germs from medieval skeletons were sampled and their genomes sequenced, researchers found that pseudogenes in M. leprae haven't changed very much in the past thousand years or so. Not only does M. leprae tend to hold onto its pseudogenes, it actively transcribes upwards of 40% of them. Probably not all of the transcripts result in expressed proteins (many lack a start codon!), but some no doubt do get translated into proteins. Let's put it this way: it would be extremely unusual for an organism to conserve this many pseudogenes if none of them was doing anything useful.

This view of a segment of the two genomes shows how a region of around 80,000 base pairs in M. tuberculosis maps to a similar 68,000-base-pair region of M. leprae. Notice that in the lowermost panel (representing M. leprae), many genes are shown as shrunken silver segments instead of fat green cylinders. The smaller grey/silver segments are pseudogenes.

To get a better idea of what's going on here, I downloaded the DNA sequences of M. leprae's 1604 "normal" genes as well as the 1116 pseudogenes. In analyzing the codons for these genes, I looked for signs of genes that were still in the normal reading frame. One way to detect this is by measuring the purine content at the various base positions in a gene's codons. In a typical protein-coding gene, around 60% of codons begin with A or G (adenine or guanine). This positional bias will, of course, be lost in a gene that has undergone frameshift mutations. Among M. leprae's 1116 pseudogenes, I found 269 in which codons showed an average AG1 percentage (A+G content, codon base one) of 55% or more. These are pseudogenes that appear to still be mostly "in frame."

Things get a lot more interesting where putative membrane proteins are concerned. In a previous post, I showed that in some genes, the second codon base is pyrimidine-rich (i.e., predominantly C or T: cytosine or thymine); these genes encode proteins with a high percentage of nonpolar amino acids. Bottom line, if a gene's codons are mostly T or C in the second position, that gene most likely encodes a membrane-associated protein. (See my previous post for some data.) This is true for all organisms (viruses, cells) and organellar genes, too, by the way, not just M. leprae. It's a generic feature of the genetic code.

When I segregated M. leprae pseudogenes according to whether the second codon base was, on average, less than or more than 40% purine, I stumbled onto something quite interesting. I found 51 pseudogenes with AG2 less than 40% (meaning these probably encode membrane-associated proteins). Of those, 32 (or 62%) are still "in frame," with AG1 > 55%. By contrast, the majority (78%) of non-membrane pseudogenes (AG2 > 40%) appear to be out of frame, with an average AG1 of only 51%.

Long story short: Most non-membrane-associated pseudogenes are out-of-frame (and likely dead), whereas 62% of putative membrane-associated pseudogenes appear to be in-frame, and therefore could still be functional (or at least, undead).

In looking at stop codons, I found that of the pseudogenes that still had stop codons, the average distance to the first stop codon is only 149 bases (whereas the average pseudogene length is 795 bases). Pseudogenes for putative membrane-associated proteins were shorter overall (as membrane proteins often are; 495 bases instead of 795), but the average distance to the first stop codon was 190 bases, significantly longer than for the other pseudogenes. This suggests some of them are still alive.

By now you're probably wondering how the heck a pseudogene can be of any possible use whatsoever when it contains a premature stop codon. The thing we need to ask, though, is why M. leprae tolerates (indeed conserves) so many pseudogenes in the first place. Could it be that the organism has adapted a frameshift-tolerant translation apparatus? Maybe some of the stop codons aren't really stop codons.

We know that a wide variety of organisms (not just viruses, where this phenomenon was first discovered, but bacteria and eukaryotes) have evolved special signals to tell ribosomes to shift in and out of frame by plus or minus one. (See "A Gripping Tale of Ribosomal Frameshifting: Extragenic Suppressors of Frameshift Mutations Spotlight P-Site Realignment," Atkins and Björk, Microbiol. Mol. Biol. Rev. 2009.) Certain tRNAs participate in "quadruplet codon" decoding, making it possible for special frameshift signals to work. The signals usually involve 7-base-long "slippery heptamer" sequences, such as CCCTGAC, right where a stop codon (TGA) appears. In other words, when a stop codon appears inside a slippery heptamer, it's not really a stop codon. Depending on the kinds (and amounts) of tRNAs "on duty," it can be a frameshift signal.

When I looked for CCCTGAC in M. leprae's pseudogenes, I found 16 in-frame occurrences of the sequence in 1116 pseudogenes. (Only 7 occurrences of the hexamer CCCTGA were found, in frame, in M. leprae's "normal" genes.) While this doesn't prove that M. leprae is up to any unusual translation tricks, it's a tantalizing result. Also bear in mind, if M. leprae is indeed up to some unusual tricks, it may very well be using frameshift signals other than (or in addition to) CCCTGAC. The fact that Mycobacterium species lack a MutS/MutL mismatch repair system means M. leprae may have adapted different ways of coping with "slippery repeats."
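Counting in-frame occurrences of a candidate slippery signal takes only a few lines. A sketch (function name mine; it checks only that the heptamer's embedded TGA lands on a codon boundary of the CDS):

```python
def in_frame_hits(cds, signal="CCCTGAC"):
    """Count occurrences of `signal` whose embedded stop codon (TGA, at
    offset 3 within the heptamer) falls on a codon boundary of the CDS."""
    hits, i = 0, cds.find(signal)
    while i != -1:
        if i % 3 == 0:  # TGA starts at i + 3, also a multiple of 3
            hits += 1
        i = cds.find(signal, i + 1)
    return hits

print(in_frame_hits("ATG" + "CCCTGAC" + "TAA"))  # 1: the TGA is in frame
print(in_frame_hits("AT" + "CCCTGAC" + "GTAA"))  # 0: the signal is out of frame
```

Summing this over all pseudogene sequences gives the kind of tally reported above.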

Further work will be needed to confirm whether M. leprae indeed translates some of its pseudogenes into proteins. The 32 "high likelihood" pseudogenes that, according to my analysis, might still encode functional (or at least expressed) membrane-associated proteins are shown in the table below. Leave a comment if you have additional thoughts.

M. leprae pseudogenes that have codons with overall AG1 > 55% and AG2 < 40%:

Pseudogene Possible product
MLBr00146 hypothetical protein
MLBr00189 hypothetical protein
MLBr00278 conserved hypothetical protein
MLBr00341 hypothetical protein
MLBr00460 hypothetical protein
MLBr00478 hypothetical protein
MLBr00738 PstA component of phosphate uptake
MLBr00836 hypothetical protein
MLBr00846 ABC transporter
MLBr01054 possible PPE-family protein
MLBr01156 hypothetical protein
MLBr01237 possible cytochrome P450
MLBr01238 probable cytochrome P450
MLBr01400 possible membrane protein
MLBr01414 PGRS-family protein
MLBr01474 hypothetical protein
MLBr01527 dihydrodipicolinate reductase
MLBr01673 conserved hypothetical protein
MLBr01792 probable Na+/H+ exchanger
MLBr01968 PE family protein
MLBr02003 probable ketoacyl reductase
MLBr02101 conserved hypothetical protein
MLBr02150 molybdopterin converting factor subunit 1
MLBr02190 PstA component of phosphate uptake
MLBr02216 dihydrolipoamide dehydrogenase
MLBr02363 19 kDa antigenic lipoprotein
MLBr02477 PE protein
MLBr02484 transcriptional regulator (LysR family)
MLBr02533 PE-family protein
MLBr02656 conserved hypothetical protein
MLBr02674 possible membrane protein

by Kas Thomas at April 12, 2014 07:52 PM

Doc Searls Weblog

Weekend linkings

  • Journalistic selfies
  • Revisiting Hart Island

by Doc Searls at April 12, 2014 06:35 PM

What's Best Next

The Traditional View of Productivity vs. Gospel-Driven Productivity

Traditional View (TV): Do more in less time.
Gospel-Driven Productivity (GDP): Do the right things, and you can care a lot less about efficiency.

TV: Use the right techniques.
GDP: Be the right kind of person. Then, use smart techniques.

TV: Seek peace of mind and fulfillment.
GDP: Seek to do good for others first, and make a contribution. Peace and fulfillment will follow (and so will suffering!—but of a different kind).

TV: Minimize work and maximize money.
GDP: Do hard things, find joy in your work as a fulfillment of your calling. Maximize meaning, not money.

by mattperman at April 12, 2014 05:15 PM

Embedded in Academia

A New Development for Coverity and Heartbleed

As a consequence of my last post, I spent some time on the phone Friday with Andy Chou, Coverity’s CTO and a former member of Dawson Engler’s research group at Stanford, from which Coverity spun off over a decade ago. Andy confirmed that Coverity does not spot the heartbleed flaw, and said that the flaw remained stubbornly undetected even when they tweaked various analysis settings. Basically, the control flow and data flow between the socket read() from which the bad data originates and the eventual bad memcpy() is just too complicated.

Let’s be clear: it is trivial to create a static analyzer that runs fast and flags heartbleed. I can accomplish this, for example, by flagging a taint error in every line of code that is analyzed. The task that is truly difficult is to create a static analysis tool that is performant and that has a high signal to noise ratio for a broad range of analyzed programs. This is the design point that Coverity is aiming for, and while it is an excellent tool there is obviously no general-purpose silver bullet: halting problem arguments guarantee the non-existence of static analysis tools that can reliably and automatically detect even simple kinds of bugs such as divide by zero. In practice, it’s not halting problem stuff that stops analyzers but rather code that has a lot of indirection and a lot of data-dependent control flow. If you want to make a program that is robustly resistant to static analysis, implement some kind of interpreter.

Anyhow, Andy did not call me to admit defeat. Rather, he wanted to show off a new analysis that he and his engineers prototyped over the last couple of days. Their insight is that we might want to consider byte-swap operations to be sources of tainted data. Why? Because there is usually no good reason to call ntohs() or a similar function on a data item unless it came from the network or from some other outside source. This is one of those ideas that is simple, sensible, and obvious — in retrospect. Analyzing a vulnerable OpenSSL using this new technique gives this result for the heartbleed code:

Nice! As you might guess, additional locations in OpenSSL are also flagged by this analysis, but it isn’t my place to share those here.

What issues are left? Well, for one thing it may not be trivial to reliably recognize byte-swapping code; for example, some people may use operations on a char array instead of shift-and-mask. A trickier issue is reliably determining when untrusted data items should be untainted: do they need to be tested against both a lower and upper bound, or just an upper bound? What if they are tested against a really large bound? Does every bit-masking operation untaint? Etc. The Coverity people will need to work out these and other issues, such as the effect of this analysis on the overall false alarm rate. In case this hasn’t been clear, the analysis that Andy showed me is a proof-of-concept prototype and he doesn’t know when it will make it into the production system.
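To make the pattern concrete, here is a minimal, hypothetical sketch (not OpenSSL's actual code, and the function name is invented) of the kind of flow this analysis targets: a length field decoded from network byte order becomes a copy length, and the bounds check against the actually-received size is precisely the kind of test that should untaint it:

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// Hypothetical record format: [2-byte big-endian length][payload...].
// The attacker controls the length field; echoing `claimed` bytes back
// without checking it against msgLen is the heartbleed-style bug.
std::vector<uint8_t> echoPayload(const uint8_t* msg, std::size_t msgLen)
{
    if(msgLen < 2)
        return std::vector<uint8_t>();
    // Byte-swap from network order -- the operation the new analysis
    // treats as a taint source.
    uint16_t claimed = static_cast<uint16_t>((msg[0] << 8) | msg[1]);
    // Untainting bounds check: validate the claimed length against what
    // was actually received before copying.
    if(claimed > msgLen - 2)
        return std::vector<uint8_t>(); // reject instead of over-reading
    return std::vector<uint8_t>(msg + 2, msg + 2 + claimed);
}
```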

by regehr at April 12, 2014 04:01 PM

Blog & Mablog

Relationships: Bedrock Discipleship

On Palm Sunday, we remember the Lord’s entry into Jerusalem shortly before He was betrayed, condemned, and executed. As we reflect on this moment in His mission, we should take care to remember what that mission was. His mission was not just to save people, it was also to save a people.

The Text:

“And the multitudes that went before, and that followed, cried, saying, Hosanna to the Son of David: Blessed is he that cometh in the name of the Lord; Hosanna in the highest” (Matt. 21:9).

Summary of the Text:

There are many things that can be drawn out of this story, but this morning, we are just going to focus on one of them. When Jesus arrived in Jerusalem—where He was to be lifted up and draw all men to Himself—He was greeted by multitudes. Contrary to the popular assumption that the Triumphal Entry crowd and the “crucify Him” crowd were the same people, we have no reason for identifying them. These people who greeted Him were doing so sincerely. Jesus was approaching Jerusalem in order to save multitudes, and He was greeted there by multitudes. Their central cry was Hosanna, which means “Save, we pray.” In other words, we are praying that You would save us. “Yes,” He answered.

Two Questions:

Back in the seventies, the great question was what is truth? Today the pressing question is where is community? Some might make this kind of observation in order to set the questions against one another, but rightly understood they are complementary questions. Truth is foundational to any true community, and community is the only appropriate response to the truth. “If we say that we have fellowship with him, and walk in darkness, we lie, and do not the truth” (1 John 1:6). Fellowship exults in the truth, and truth generates fellowship.


The biblical word for fellowship is koinonia, and  here is how the idea connects to our text. To welcome Christ into Jerusalem you have to go down to the street He is on. When you do so, you are not just praising Him as He travels by. You also have a necessary relationship to those people on your right and left who are also praising Him. Christ was welcomed to the week of His passion by a crowd, and not by the last true believer. Save us, they cried, and that is what He did.

But the crowd had to come to Christ. They could not have gone two blocks over, turned and faced each other, and established a little koinonia by themselves. It never works.

In modern church parlance, fellowship means coffee and donuts. But in the biblical world, fellowship meant mutual partaking and indwelling. Fellowship is what we have in the body together, as we are being knit together in love.

One Another:

A body is what we are. We do not act in a particular way in order to become a body, we are to act that way because we are a body and desire to be a well-functioning one. “So we, being many, are one body in Christ, and every one members one of another” (Rom. 12:5).


When it comes to life in the body, there are all kinds of offenses. There are business offenses. There are family offenses. There is petty rudeness in the parking lot, and there is glaring sin within a marriage. What in the world are we to do with other people? “Wherefore receive ye one another, as Christ also received us to the glory of God” (Rom. 15:7).

It glorified God when Christ received us, and it glorifies Him when we receive one another. When we receive a brother or sister, we are not promising to “look the other way.” That is not biblical receiving. We are promising to let love cover it, when that is appropriate, and to confront it, when that is appropriate. We are promising to not complain about it to others. We either cover it or confront it, and this principled communion is why it is possible to excommunicate in love.


Of course the center of this is love. When we look at the “one anothers” of Scripture, this has a central place. “A new commandment I give unto you, That ye love one another; as I have loved you, that ye also love one another” (John 13:34). “By this shall all men know that ye are my disciples, if ye have love one to another” (John 13:35). “This is my commandment, That ye love one another, as I have loved you” (John 15:12). “These things I command you, that ye love one another” (John 15:17). “Owe no man any thing, but to love one another: for he that loveth another hath fulfilled the law” (Rom. 13:8).

We can only love because we have been loved. And we can only know that we have been loved if we grasp—through a living faith—the glories of the gospel. Christ died and was buried, Christ was buried and rose, and He did it so that you might be put right with God. You are ushered into the fellowship of love that He offers, and this is what makes it possible for you to love your neighbor.


But it is very tempting for us to conceive of love as a generic disposition to “be nice.” But love rolls up its sleeves, and gets into the dirty work. If all we had to do was sit around and radiate love rays at one another, I am sure we would all be up to the task. But what about all those provocations that come from . . . you know, other people?

We begin by making sure that we do not rise to the provocations. We need to have peace with one another. One of the characteristics of the band that traveled with Jesus is that He had to caution them to preserve the peace with each other. “Salt is good: but if the salt have lost his saltness, wherewith will ye season it? Have salt in yourselves, and have peace one with another” (Mark 9:50).

We should labor to think alike. We noted earlier that truth is the foundation of community, and the more we share in the truth, and walk in it, the greater will be our unity. “Now the God of patience and consolation grant you to be likeminded one toward another according to Christ Jesus” (Rom. 15:5). Our modern temptation is that of simply “agreeing to disagree,” which is fine as a temporary measure—but it is not the ultimate goal that Scripture sets out for us.

But the “one anothers” we pursue should not be limited to staying out of fights. “Be kindly affectioned one to another with brotherly love; in honour preferring one another” (Rom. 12:10). Scripture tells us to point the honor away from ourselves, and toward the other.


As the people of God, we are being gathered. But we cannot be gathered without being gathered together. And once we are gathered together, we face the glorious calling of life together. But in order to maintain this, we have to keep emphasizing the basics—gospel, love, forgiveness, truth.

by Douglas Wilson at April 12, 2014 03:56 PM

Dressing Up Without Playing Dress Ups

One of the things we need to remember when it comes to church architecture is that a building is corporate clothing. A building is how the whole church dresses. The trick is how to dress up without playing dress ups.

Now we have taught for years that worship ought to be respectful and dignified, not breezy and casual. We do not take out ads in the paper inviting the unchurched to come in their pjs. We are supposed to worship God with reverence and godly fear, and this includes our demeanor, and our demeanor includes our clothing (Heb. 12:28). Paul rejoiced that the worship of the Colossians was in good order (Col. 2:5).

Some of you may have noticed that I am not dressed as I usually am. The way I usually dress is intended to communicate respect, but dressing this way does not mean disrespect—it only means that the language of respect can vary. But whenever anything is done week after week without ever varying it, the unspoken assumption can take root in a congregation that this is the way it is done. And from that to petty liturgical idolatry is just a few short steps. This is even more the case when the dress is explicitly ecclesiastical—robes and so forth.

What does this have to do with a building? If a building is our corporate clothing, and it will be, after we have been worshiping there for fifty years, if the pastor then notices that the congregation has gotten attached in the wrong way, he can’t change it up for a week or two in order to make a point. The thing is built out of stone. The trick is how to keep a stone building from creating stone hearts. It is supposed to go the other way. Living hearts of flesh make the building glorious . . . and clothing we can use simply as a way of speaking the truth.

So let the stones cry out.

by Douglas Wilson at April 12, 2014 03:52 PM

Grace is High When Hearts are Low

This meal presents God’s answer to the problem of evil. It does this in at least two respects. The first thing God wants to do with evil is forgive it, cleanse it, wash it away. His eternal design, established before all worlds, was to populate the resurrection with untold millions of forgiven sinners. The apostle John saw a multitude that no one could number standing before the throne. And here on this Table we see the foundation of that forgiveness. If Christ had not died under the wrath of God, as a propitiation for our sins, we would all be utterly lost. And that is the meaning of the broken bread here. That is the meaning of the red wine in the cup. Christ died in the place of sinners.

But a second answer to the problem of evil concerns the mere existence of evil. Why did God allow evil to come into existence at all? The answer is that so we might come to understand in practice how the lowliness of humility overthrows the greatness of the proud. We overcome evil when our graces are high, but our graces are only high when our hearts are low.

When our hearts are low, we see things in a spiritual light. A humble man will never think that anything is little or small if Christ is there. And that same humble man, if Christ is not there, will laugh at every pretended greatness. Now—is Christ here now? Is this a little thing? You will shortly have a morsel of bread in your hand, and you will have a taste of wine on your lips. What will that do? It throws down the principalities and powers, and it does so because Christ is here—as we see in evangelical faith. It does so because it embodies the gospel of grace, and that grace means that God was at work in the cross. What did He do? In humility, Christ became a propitiation, and we were forgiven. In humility, He modeled that humility for us to follow after we were forgiven, so that we might become the agents of extending His kingdom from the river to the ends of the earth.

So come, and welcome, to Jesus Christ.

by Douglas Wilson at April 12, 2014 03:51 PM

Front Porch Republic

Jesse Winchester, Southern Regionalist, RIP

Jesse Winchester, a tuneful poet from a small corner of southern America, passed away yesterday. He had to flee America, at a time when it was going through one of its more invasively imperial stages, to find his voice. Anyone who knows the work of The Band or Reba McEntire or Jimmy Buffett needs no introduction to the story of this greatly talented and mostly ignored songwriter; for everyone else, check out his work in these two clips, and ponder what great voices emerge from little places:

The post Jesse Winchester, Southern Regionalist, RIP appeared first on Front Porch Republic.

by Russell Arben Fox at April 12, 2014 03:33 PM

Alarming Development

See you at Strange Loop

Two announcements. First, the official Call for Submissions is up.
Second, we will be at StrangeLoop too. We are partnering with Alex Payne and his Emerging Languages Camp to run FPW on the day before StrangeLoop. You can submit for SPLASH or StrangeLoop or both. See the Call for more details.
Now you have twice the motivation to do a killer demo. Get on it!

by Jonathan Edwards at April 12, 2014 02:42 PM

Planet Lisp

Colin Lupton: Self-seeding Context Added to CL-ISAAC

As recommended by Bob Jenkins, the original author of the ISAAC cryptographic random number generator algorithms, self-seeding ISAAC is a useful technique for increasing the cryptographic strength of the random numbers generated from a given ISAAC context; i.e., using random values generated by ISAAC to seed a new ISAAC context. This may not seem particularly valuable for one-off random values such as the session tokens generated in CL-ISAAC’s documented Quick Recipes, but when you need to generate millions of cryptographically-strong random numbers from a single context—such as for a One-Time Pad cipher—you notice the extra strength that self-seeding provides.

CL-ISAAC v1.0.4 is now available on GitHub, which includes the self-seeding context. It will be available in the April distribution of Quicklisp.


Using the Self-seed context is similar to the other seeds already available; the function supports both ISAAC-32 and ISAAC-64 algorithms, and provides one additional keyword parameter, count, which specifies the number of rounds your ISAAC context will be self-seeded. The default value is 1, but a count greater than 3 is recommended.

Usage is as straightforward as the other contexts. To create a 512-bit hexadecimal string token using the ISAAC-64 algorithm from a self-seeded context with 3 rounds:

* (ql:quickload "cl-isaac")
* (defvar *self-ctx* (isaac:init-self-seed :count 3 :is64 t))
* (format nil "~64,'0x" (isaac:rand-bits-64 *self-ctx* 512))

The Self-seeding context is necessarily heavier than the kernel and cl:random seeds, by a factor of approx. 5n+1, where n is the number of self-seeding rounds. Specifically, for every round there is an additional context created, as well as an additional scramble.


April 12, 2014 01:39 PM


Custom Vector Allocation

(Also posted to, number 6 in a series of posts about Vectors and Vector based containers.)
A few posts back I talked about the idea of ‘rolling your own’ STL-style vector class, based on my experiences with this at PathEngine. In that original post and these two follow-ups I talked about the general approach and also some specific performance tweaks that actually helped in practice for our vector use cases. I haven’t talked about custom memory allocation yet, however. This is something that’s been cited in a number of places as a key reason for switching away from std::vector, so I’ll come back now and look at the approach we took for this (which is pretty simple, but non-standard, and also pre-C++11), and assess some of the implications of using this kind of non-standard approach.

I approach this from the point of view of a custom vector implementation, but I’ll be talking about some issues with memory customisation that also apply more generally.

Why custom allocation?

In many situations it’s fine for vectors (and other containers) to just use the same default memory allocation method as the rest of your code, and this is definitely the simplest approach.

(The example vector code I posted previously used malloc() and free(), but works equally well with global operator new and delete.)

But vectors can do a lot of memory allocation, and memory allocation can be expensive, and it’s not uncommon for memory allocation operations to turn up in profiling as the most significant cost of vector based code. Custom memory allocation approaches can help resolve this.

And some other good reasons for hooking into and customising allocations can be the need to avoid memory fragmentation or to track memory statistics.

For these reasons generalised memory customisation is an important customer requirement for our SDK code in general, and then by extension for the vector containers used by this code.

Custom allocation in std::vector

The STL provides a mechanism for hooking into the container allocation calls (such as vector buffer allocations) through allocators, with vector constructors accepting an allocator argument for this purpose.

I won’t attempt a general introduction to STL allocators, but there’s a load of material about this on the web. See, for example, this article on Dr Dobbs, which includes some example use cases for allocators. (Bear in mind that this is pre C++11, however. I didn’t see any similarly targeted overview posts for using allocators post C++11.)
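As a minimal illustration of the standard mechanism (the allocator name here is hypothetical, and this is the simplified C++11 form rather than the pre-C++11 interface discussed below), an allocator needs little more than value_type, allocate() and deallocate(), with std::allocator_traits supplying defaults for the rest; note that the allocator type becomes part of the container's type:

```cpp
#include <cstdlib>
#include <cstddef>
#include <new>
#include <vector>

// Minimal C++11-style allocator routing through malloc/free.
template <class T>
struct cMallocStdAllocator
{
    typedef T value_type;
    cMallocStdAllocator() {}
    template <class U> cMallocStdAllocator(const cMallocStdAllocator<U>&) {}
    T* allocate(std::size_t n)
    {
        if(void* p = std::malloc(n * sizeof(T)))
            return static_cast<T*>(p);
        throw std::bad_alloc();
    }
    void deallocate(T* p, std::size_t) { std::free(p); }
};
// All instances are interchangeable, so they always compare equal.
template <class T, class U>
bool operator==(const cMallocStdAllocator<T>&, const cMallocStdAllocator<U>&) { return true; }
template <class T, class U>
bool operator!=(const cMallocStdAllocator<T>&, const cMallocStdAllocator<U>&) { return false; }

std::size_t fillAndCount()
{
    // The allocator type is a template parameter of the container type.
    std::vector<int, cMallocStdAllocator<int> > v(10, 0);
    return v.size();
}
```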

A non-standard approach

We actually added the possibility to customise memory allocation in our vectors some time after switching to a custom vector implementation. (This was around mid-2012. Before that PathEngine’s memory customisation hooks worked by overriding global new and delete, and required dll linkage if you wanted to manage PathEngine memory allocations separately from allocations in the main game code.)

We’ve generally tried to keep our custom vector as similar as possible to std::vector, in order to avoid issues with unexpected behaviour (since a lot of people know how std::vector works), and to ensure that code can be easily switched between std::vector and our custom vector. When it came to memory allocation, however, we chose a significantly different (and definitely non-standard) approach, because in practice a lot of vector code doesn’t actually use allocators (or else just sets allocators in a constructor), because we already had a custom vector class in place, and because I just don’t like STL allocators!

Other game developers

A lot of other game developers have a similar opinion of STL allocators, and for many this is actually then also a key factor in a decision to switch to custom container classes.

For example, issues with the design of STL allocators are quoted as one of the main reasons for the creation of the EASTL, a set of STL replacement classes, by Electronic Arts. From the EASTL paper:

Among game developers the most fundamental weakness is the std allocator design, and it is this weakness that was the largest contributing factor to the creation of EASTL.

And I’ve heard similar things from other developers. For example, in this blog post about the Bitsquid approach to allocators Niklas Frykholm says:

If it weren’t for the allocator interface I could almost use STL. Almost.

Let’s have a look at some of the reasons for this distaste!

Problems with STL allocators

We’ll look at the situation prior to C++11, first of all, and the historical basis for switching to an alternative mechanism.

A lot of problems with STL allocators come out of confusion in the initial design. According to Alexander Stepanov (primary designer and implementer of the STL) the custom allocator mechanism was invented to deal with a specific issue with Intel memory architecture. (Do you remember near and far pointers? If not, consider yourself lucky I guess!) From this interview with Alexander:

Question: How did allocators come into STL? What do you think of them?

Answer: I invented allocators to deal with Intel’s memory architecture. They are not such a bad ideas in theory – having a layer that encapsulates all memory stuff: pointers, references, ptrdiff_t, size_t. Unfortunately they cannot work in practice.

And it seems like this original design intention was also only partially executed. From the wikipedia entry for allocators:

They were originally intended as a means to make the library more flexible and independent of the underlying memory model, allowing programmers to utilize custom pointer and reference types with the library. However, in the process of adopting STL into the C++ standard, the C++ standardization committee realized that a complete abstraction of the memory model would incur unacceptable performance penalties. To remedy this, the requirements of allocators were made more restrictive. As a result, the level of customization provided by allocators is more limited than was originally envisioned by Stepanov.

and, further down:

While Stepanov had originally intended allocators to completely encapsulate the memory model, the standards committee realized that this approach would lead to unacceptable efficiency degradations. To remedy this, additional wording was added to the allocator requirements. In particular, container implementations may assume that the allocator’s type definitions for pointers and related integral types are equivalent to those provided by the default allocator, and that all instances of a given allocator type always compare equal, effectively contradicting the original design goals for allocators and limiting the usefulness of allocators that carry state.

Some of the key problems with STL allocators (historically) are then:

  • Unnecessary complexity, with some boilerplate stuff required for features that are not actually used
  • A limitation that allocators cannot have internal state (‘all instances of a given allocator type are required to be interchangeable and always compare equal to each other’)
  • The fact that the allocator type is included in the container type (with changes to allocator type changing the type of the container)

There are some changes to this situation with C++11, as we’ll see below, but this certainly helps explain why a lot of people have chosen to avoid the STL allocator mechanism, historically!

Virtual allocator interface

So we decided to avoid STL allocators, and use a non-standard approach.

The approach we use is based on a virtual allocator interface, and avoids the need to specify allocator type as a template parameter.

This is quite similar to the setup for allocators in the BitSquid engine, as described by Niklas here (as linked above, it’s probably worth reading that post if you didn’t see this already, as I’ll try to avoid repeating the various points he discussed there).

A basic allocator interface can then be defined as follows:

class iAllocator
{
public:
    virtual ~iAllocator() {}
    virtual void* allocate(tUnsigned32 size) = 0;
    virtual void deallocate(void* ptr) = 0;
// helper
    template <class T> void
    allocate_Array(tUnsigned32 arraySize, T*& result)
    {
        result = static_cast<T*>(allocate(sizeof(T) * arraySize));
    }
};

The allocate_Array() method is for convenience; concrete allocator objects just need to implement allocate() and deallocate().

We can store a pointer to iAllocator in our vector, and replace the direct calls to malloc() and free() with virtual function calls, as follows:

    T*
    allocate(size_type size)
    {
        T* allocated;
        _allocator->allocate_Array(size, allocated);
        return allocated;
    }
    void
    reallocate(size_type newCapacity)
    {
        T* newData;
        _allocator->allocate_Array(newCapacity, newData);
        copyRange(_data, _data + _size, newData);
        deleteRange(_data, _data + _size);
        _data = newData;
        _capacity = newCapacity;
    }

These virtual function calls potentially add some overhead to allocation and deallocation. It’s worth being quite careful about this kind of virtual function call overhead, but in practice it seems that the overhead is not significant here. Virtual function call overhead is often all about cache misses and, perhaps because there are often just a small number of actual allocator instances active, with allocations tending to be grouped by allocator, this just isn’t such an issue here.

We use a simple raw pointer for the allocator reference. Maybe a smart pointer type could be used (for better modern C++ style and to increase safety), but we usually want to control allocator lifetime quite explicitly, so we’re basically just careful about this.

Allocators can be passed in to each vector constructor, or if omitted will default to a ‘global allocator’ (which adds a bit of extra linkage to our vector header):

    cVector(size_type size, const T& fillWith,
            iAllocator& allocator = GlobalAllocator())
    {
        _data = 0;
        _allocator = &allocator;
        _size = size;
        _capacity = size;
        if(size)
        {
            _allocator->allocate_Array(_capacity, _data);
            constructRange(_data, _data + size, fillWith);
        }
    }

Here’s an example concrete allocator implementation:

class cMallocAllocator : public iAllocator
{
public:
    void* allocate(tUnsigned32 size)
    {
        assert(size);
        return malloc(static_cast<size_t>(size));
    }
    void deallocate(void* ptr)
    {
        free(ptr);
    }
};
(Note that you normally can call malloc() with zero size, but this is something that we disallow for PathEngine allocators.)

And this can be passed in to vector construction as follows:

    cMallocAllocator allocator;
    cVector<int> v(10, 0, allocator);

Swapping vectors

That’s pretty much it, but there’s one tricky case to look out for.

Specifically, what should happen in our vector swap() method? Let’s take a small diversion to see why there might be a problem.

Consider some code that takes a non-const reference to vector, and ‘swaps a vector out’ as a way of returning a set of values in the vector without the need to heap allocate the vector object itself:

class cVectorBuilder
{
    cVector<int> _v;
public:
    //.... construction and other building methods
    void takeResult(cVector<int>& result); // swaps _v into result
};

So this code doesn’t care about allocators, and just wants to work with a vector of a given type. And maybe there is some other code that uses this, as follows:

void BuildData(/*some input params*/, cVector<int>& result)
{
  //.... construct a cVectorBuilder and call a bunch of build methods
}

Now there’s no indication that there’s going to be a swap() involved, but the result vector will end up using the global allocator, and this can potentially cause some surprises in the calling code:

   cVector<int> v(someSpecialAllocator);
   BuildData(/*input params*/, v);
   // lost our allocator assignment!
   // v now uses the global allocator

Nobody’s really doing anything wrong here (although this isn’t really the modern C++ way to do things). This is really a fundamental problem arising from the possibility to swap vectors with different allocators, and there are other situations where this can come up.

You can find some discussion about the possibilities for implementing vector swap with ‘unequal allocators’ here. We basically choose option 1, which is to simply declare it illegal to call swap with vectors with different allocators. So we just add an assert in our vector swap method that the two allocator pointers are equal.

In our case this works out fine, since this doesn’t happen so much in practice, because cases where this does happen are caught directly by the assertion, and because it’s generally straightforward to modify the relevant code paths to resolve the issue.
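A sketch of what that swap method might look like, using hypothetical member names following the earlier snippets (wrapped in a minimal standalone struct here, rather than the real cVector):

```cpp
#include <cassert>
#include <cstddef>
#include <utility>

// Minimal stand-in for the vector's data members, to illustrate the
// swap policy: option 1 declares swapping with unequal allocators illegal.
struct cVectorSketch
{
    void*       _allocator;
    int*        _data;
    std::size_t _size;
    std::size_t _capacity;

    cVectorSketch() : _allocator(0), _data(0), _size(0), _capacity(0) {}

    void swap(cVectorSketch& other)
    {
        // Swapping vectors with different allocators is simply disallowed.
        assert(_allocator == other._allocator);
        std::swap(_data, other._data);
        std::swap(_size, other._size);
        std::swap(_capacity, other._capacity);
    }
};
```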

Comparison with std::vector: is this necessary/better?

Ok, so I’ve outlined the approach we take for custom allocation in our vector class.

This all works out quite nicely for us. It’s straightforward to implement and to use, and consistent with the custom allocators we use more generally in PathEngine. And we already had our custom vector in place when we came to implement this, so this wasn’t part of the decision about whether or not to switch to a custom vector implementation. But it’s interesting, nevertheless, to compare this approach with the standard allocator mechanism provided by std::vector.

My original ‘roll-your-own vector’ blog post was quite controversial. There were a lot of responses strongly against the idea of implementing a custom vector, but a lot of other responses (often from the game development industry side) saying something like ‘yes, we do that, but we do some detail differently’, and I know that this kind of customisation is not uncommon in the industry.

These two different viewpoints make it worthwhile to explore this question in a bit more detail, then, I think.

I already discussed the potential pitfalls of switching to a custom vector implementation in the original ‘roll-your-own vector’ blog post, so let’s look at the potential benefits of switching to a custom allocator mechanism.

Broadly speaking, this comes down to three key points:

  • Interface complexity
  • Stateful allocator support
  • Possibilities for further customisation and memory optimisation

Interface complexity

If we look at an example allocator implementation for each setup we can see that there’s a significant difference in the amount of code required. The following code is taken from my previous post, and was used to fill allocated memory with non-zero values, to check for zero initialisation:

// STL allocator version
template <class T>
class cNonZeroedAllocator
{
public:
    typedef T value_type;
    typedef value_type* pointer;
    typedef const value_type* const_pointer;
    typedef value_type& reference;
    typedef const value_type& const_reference;
    typedef std::size_t size_type;
    typedef std::ptrdiff_t difference_type;

    template <class tTarget>
    struct rebind
    {
        typedef cNonZeroedAllocator<tTarget> other;
    };

    cNonZeroedAllocator() {}
    ~cNonZeroedAllocator() {}
    template <class T2>
    cNonZeroedAllocator(cNonZeroedAllocator<T2> const&) {}

    pointer
    address(reference ref)
    {
        return &ref;
    }
    const_pointer
    address(const_reference ref)
    {
        return &ref;
    }

    pointer
    allocate(size_type count, const void* = 0)
    {
        size_type byteSize = count * sizeof(T);
        void* result = malloc(byteSize);
        signed char* asCharPtr;
        asCharPtr = reinterpret_cast<signed char*>(result);
        for(size_type i = 0; i != byteSize; ++i)
            asCharPtr[i] = -1;
        return reinterpret_cast<pointer>(result);
    }
    void deallocate(pointer ptr, size_type)
    {
        free(ptr);
    }

    size_type
    max_size() const
    {
        return 0xffffffffUL / sizeof(T);
    }
    void
    construct(pointer ptr, const T& t)
    {
        new(ptr) T(t);
    }
    void
    destroy(pointer ptr)
    {
        ptr->~T();
    }

    template <class T2> bool
    operator==(cNonZeroedAllocator<T2> const&) const
    {
        return true;
    }
    template <class T2> bool
    operator!=(cNonZeroedAllocator<T2> const&) const
    {
        return false;
    }
};
But with our custom allocator interface this can now be implemented as follows:

// custom allocator version
class cNonZeroedAllocator : public iAllocator
{
public:
    void* allocate(tUnsigned32 size)
    {
        void* result = malloc(static_cast<size_t>(size));
        signed char* asCharPtr;
        asCharPtr = reinterpret_cast<signed char*>(result);
        for(tUnsigned32 i = 0; i != size; ++i)
            asCharPtr[i] = -1;
        return result;
    }
    void deallocate(void* ptr)
    {
        free(ptr);
    }
};

As we saw previously, a lot of what's in the STL allocator relates to some obsolete design decisions, and is unlikely to actually be used in practice. The custom allocator interface also completely abstracts out the concept of constructed object type, and works only in terms of actual memory sizes and pointers, which seems more natural, while still doing everything we need for the allocator use cases in PathEngine.

For me this is one advantage of the custom allocation setup, although probably not something that would by itself justify switching to a custom vector.

If you use allocators that depend on customisation of the other parts of the STL allocator interface (other than for data alignment) please let me know in the comments thread. I’m quite interested to hear about this! (There’s some discussion about data alignment customisation below.)

Stateful allocator requirement

Stateful allocator support is a specific customer requirement for PathEngine.

Clients need to be able to set custom allocation hooks and have all allocations made by the SDK (including vector buffer allocations) routed to custom client-side allocation code. Furthermore, multiple allocation hooks can be supplied, with the allocation strategy selected depending on the local execution context.

It’s not feasible to supply allocation context to all of our vector-based code as a template parameter, so we need our vector objects to support stateful allocators.

Stateful allocators with the virtual allocator interface

Stateful allocators are straightforward with our custom allocator setup. Vectors can be assigned different concrete allocator implementations and these concrete allocator implementations can include internal state, without code that works on the vectors needing to know anything about these details.
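To make this concrete, here's a minimal sketch of a stateful concrete allocator behind the virtual interface. Only the `iAllocator` name comes from this post; the counting allocator and its members are illustrative assumptions, not PathEngine's actual code:

```cpp
#include <cstdlib>

// The virtual allocator interface, as described in the post.
class iAllocator
{
public:
    virtual ~iAllocator() {}
    virtual void* allocate(unsigned size) = 0;
    virtual void deallocate(void* ptr) = 0;
};

// A concrete allocator with internal state: here, a running count of
// allocations made through this particular allocator instance.
// Code working through an iAllocator reference never sees this state.
class cCountingAllocator : public iAllocator
{
    int _allocations;
public:
    cCountingAllocator() : _allocations(0) {}
    void* allocate(unsigned size)
    {
        ++_allocations;
        return std::malloc(size);
    }
    void deallocate(void* ptr)
    {
        std::free(ptr);
    }
    int allocations() const
    {
        return _allocations;
    }
};
```

A vector object then just holds an `iAllocator` pointer or reference, and different vectors can be handed different concrete allocator instances at runtime.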

Stateful allocators with the STL

As discussed earlier, internal allocator state was specifically forbidden by the original STL allocator specification. This has been revisited in C++11, however, and stateful allocators are now explicitly supported; it also looks like it’s possible to use stateful allocators in practice with many pre-C++11 compile environments.

The reasons for disallowing stateful allocators relate to two specific problem situations:

  • Splicing nodes between linked lists with different allocation strategies
  • Swapping vectors with different allocation strategies

C++11 addresses these issues with allocator traits, which specify what to do with allocators in problem cases, with stateful allocators then explicitly supported. This stackoverflow answer discusses what happens, specifically, with C++11, in the vector swap case.

With PathEngine we want to be able to support clients with different compilation environments, and it’s an advantage not to require C++11 support. But according to this stackoverflow answer, you can also actually get away with using stateful allocators in most cases, without explicit C++11 support, as long as you avoid these problem cases.

Since we already prohibit the vector problem case (swap with unequal allocators), that means that we probably can actually implement our stateful allocator requirement with std::vector and STL allocators in practice, without requiring C++11 support.

There’s just one proviso, with or without C++11 support, due to allowances for legacy compiler behaviour in allocator traits. Specifically, it doesn’t look like we can get the same assertion behaviour in vector swap. If propagate_on_container_swap::value is set to false for either allocator then the result is ‘undefined behaviour’, so this could just swap the allocators silently, and we’d have to be quite careful about these kinds of problem cases!
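The relevant trait can be inspected directly through C++11 allocator_traits. The following is a minimal sketch (the allocator type and its ‘tag’ state are made-up examples) showing the default for an allocator that doesn't specify the trait:

```cpp
#include <cstddef>
#include <memory>

// A made-up minimal stateful allocator. Since it doesn't define
// propagate_on_container_swap, allocator_traits defaults that trait to
// false_type -- exactly the case where swapping containers with unequal
// allocators is undefined behaviour.
template <class T>
struct cTaggedAllocator
{
    typedef T value_type;
    int tag; // internal allocator state
    explicit cTaggedAllocator(int tag_) : tag(tag_) {}
    T* allocate(std::size_t count)
    {
        return static_cast<T*>(::operator new(count * sizeof(T)));
    }
    void deallocate(T* ptr, std::size_t)
    {
        ::operator delete(ptr);
    }
};

typedef std::allocator_traits<cTaggedAllocator<int> > tIntAllocTraits;
// tIntAllocTraits::propagate_on_container_swap::value is false here, so a
// std::vector swap between vectors with unequal cTaggedAllocator instances
// would be undefined behaviour, with no assertion to catch it.
```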

Building on stateful allocators to address other issues

If you can use stateful allocators with the STL then this changes things a bit. A lot of things become possible just by adding suitable internal state to standard STL allocator implementations. But you can also now use this allocator internal state as a kind of bootstrap to work around other issues with STL allocators.

The trick is to wrap up the same kind of virtual allocator interface setup we use in PathEngine in an STL allocator wrapper class. You could do this (for example) by putting a pointer to our iAllocator interface inside an STL allocator class (as internal state), and then forwarding the actual allocation and deallocation calls as virtual function calls through this pointer.
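A sketch of what this wrapper could look like, as a minimal C++11-style allocator. The `iAllocator` name is from this post; the wrapper and target class names are made up for illustration:

```cpp
#include <cstddef>
#include <cstdlib>
#include <vector>

// The virtual allocator interface, as in the post.
class iAllocator
{
public:
    virtual ~iAllocator() {}
    virtual void* allocate(std::size_t size) = 0;
    virtual void deallocate(void* ptr) = 0;
};

// A trivial concrete target, for demonstration.
class cMallocTarget : public iAllocator
{
public:
    void* allocate(std::size_t size) { return std::malloc(size); }
    void deallocate(void* ptr) { std::free(ptr); }
};

// STL allocator wrapper: the only state is a pointer to the virtual
// interface, and allocation calls are forwarded through that pointer.
template <class T>
class cForwardingSTLAllocator
{
    iAllocator* _target;
public:
    typedef T value_type;
    explicit cForwardingSTLAllocator(iAllocator* target) : _target(target) {}
    template <class T2>
    cForwardingSTLAllocator(const cForwardingSTLAllocator<T2>& rhs) : _target(rhs.target()) {}
    T* allocate(std::size_t count)
    {
        return static_cast<T*>(_target->allocate(count * sizeof(T)));
    }
    void deallocate(T* ptr, std::size_t)
    {
        _target->deallocate(ptr);
    }
    iAllocator* target() const { return _target; }
};
template <class T1, class T2> bool
operator==(const cForwardingSTLAllocator<T1>& lhs, const cForwardingSTLAllocator<T2>& rhs)
{
    return lhs.target() == rhs.target();
}
template <class T1, class T2> bool
operator!=(const cForwardingSTLAllocator<T1>& lhs, const cForwardingSTLAllocator<T2>& rhs)
{
    return !(lhs == rhs);
}
```

With this in place, a `std::vector<int, cForwardingSTLAllocator<int> >` can be pointed at any concrete iAllocator implementation at construction time, without changing the vector type.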

So, at the cost of another layer of complexity (which can be mostly hidden from the main application code), it should now be possible to:

  • remove unnecessary boilerplate from concrete allocator implementations (since these now just implement iAllocator), and
  • use different concrete allocator types without changing the actual vector type.

Although I’m still not keen on STL allocators, and prefer the direct simplicity of our custom allocator setup as opposed to covering up the mess of the STL allocator interface in this way, I have to admit that this does effectively remove two of the key benefits of our custom allocator setup. Let’s move on to the third point, then!

Refer to the Bloomberg allocator model for one example of this kind of setup in practice (and see also this presentation about Bloomberg allocators in the context of C++11 allocator changes).

Memory optimisation

The other potential benefit of custom allocation over STL allocators is essentially the freedom to change the allocation interface itself.

With STL allocators we’re restricted to using the allocate() and deallocate() methods exactly as defined in the original allocator specification. But with our custom allocator we’re basically free to mess with these method definitions (in consultation with our clients!), or to add additional methods, and generally change the interface to better suit our clients’ needs.

There is some discussion of this issue in this proposal for improving STL allocators, which talks about ways in which the memory allocation interface provided by STL allocators can be sub-optimal.

Some customisations implemented in the Bitsquid allocators are:

  • an ‘align’ parameter for the allocation method, and
  • a query for the size of allocated blocks

PathEngine allocators don’t include either of these customisations, although this is stuff that we can add quite easily if required by our clients. Our allocator does include the following extra methods:

    virtual void*
    expand(
            void* oldPtr,
            tUnsigned32 oldSize,
            tUnsigned32 oldSize_Used,
            tUnsigned32 newSize
            ) = 0;

// helper
    template <class T> void
    expand_Array(
            T*& ptr,
            tUnsigned32 oldArraySize,
            tUnsigned32 oldArraySize_Used,
            tUnsigned32 newArraySize
            )
    {
        ptr = static_cast<T*>(expand(
            ptr,
            sizeof(T) * oldArraySize,
            sizeof(T) * oldArraySize_Used,
            sizeof(T) * newArraySize
            ));
    }
What this does, essentially, is to provide a way for concrete allocator classes to use the realloc() system call, or similar memory allocation functionality in a custom heap, if this is desired.

As before, the expand_Array() method is there for convenience, and concrete classes only need to implement the expand() method. This takes a pointer to an existing memory block, and can either add space to the end of this existing block (if possible), or allocate a larger block somewhere else and move existing data to that new location (based on the oldSize_Used parameter).

Implementing expand()

A couple of example implementations for expand() are as follows:

// in cMallocAllocator, using realloc()
    void*
    expand(
        void* oldPtr,
        tUnsigned32 oldSize,
        tUnsigned32 oldSize_Used,
        tUnsigned32 newSize
        )
    {
        assert(oldSize_Used <= oldSize);
        assert(newSize > oldSize);
        return realloc(oldPtr, static_cast<size_t>(newSize));
    }

// as allocate and move
    void*
    expand(
        void* oldPtr,
        tUnsigned32 oldSize,
        tUnsigned32 oldSize_Used,
        tUnsigned32 newSize
        )
    {
        assert(oldSize_Used <= oldSize);
        assert(newSize > oldSize);
        void* newPtr = allocate(newSize);
        memcpy(newPtr, oldPtr, static_cast<size_t>(oldSize_Used));
        deallocate(oldPtr);
        return newPtr;
    }

So this can either call through directly to something like realloc(), or emulate realloc() with a sequence of allocation, memory copy and deallocation operations.
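On the vector side, a growth step built on expand() might then look something like the following minimal sketch. The GrowBuffer helper, the growth factor, and the interface subset shown are illustrative assumptions (for trivially copyable element types only), not PathEngine's actual vector code:

```cpp
#include <cstdlib>

// Subset of the allocator interface relevant to buffer growth.
class iAllocator
{
public:
    virtual ~iAllocator() {}
    virtual void deallocate(void* ptr) = 0;
    virtual void* expand(
            void* oldPtr,
            unsigned oldSize,
            unsigned oldSize_Used,
            unsigned newSize
            ) = 0;
};

// realloc() based concrete implementation, as described in the post.
// Note that realloc(0, size) behaves like malloc(size), so this also
// covers the initial allocation.
class cMallocAllocator : public iAllocator
{
public:
    void deallocate(void* ptr)
    {
        std::free(ptr);
    }
    void* expand(void* oldPtr, unsigned, unsigned, unsigned newSize)
    {
        return std::realloc(oldPtr, newSize);
    }
};

// Growth step for a buffer of trivially copyable elements: the allocator
// either grows the block in place or moves the used portion for us, so
// the calling code never performs a separate copy.
template <class T> void
GrowBuffer(iAllocator& a, T*& buffer, unsigned& capacity, unsigned sizeUsed)
{
    unsigned newCapacity = capacity ? capacity * 2 : 8;
    buffer = static_cast<T*>(a.expand(
        buffer,
        sizeof(T) * capacity,
        sizeof(T) * sizeUsed,
        sizeof(T) * newCapacity
        ));
    capacity = newCapacity;
}
```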

Benchmarking with realloc()

With this expand() method included in our allocator it’s pretty straightforward to update our custom vector to use realloc(), and it’s easy to see how this can potentially optimise memory use, but does this actually make a difference in practice?

I tried some benchmarking and it turns out that this depends very much on the actual memory heap implementation in use.

I tested this first of all with the following simple benchmark:

template <class tVector> static void
PushBackBenchmark(tVector& target)
{
    const int pattern[] = {0,1,2,3,4,5,6,7};
    const int patternLength = sizeof(pattern) / sizeof(*pattern);
    const int iterations = 10000000;
    tSigned32 patternI = 0;
    for(tSigned32 i = 0; i != iterations; ++i)
    {
        target.push_back(pattern[patternI]);
        ++patternI;
        if(patternI == patternLength)
            patternI = 0;
    }
}
(Wrapped up in some code for timing over a bunch of iterations, with result checking to avoid the push_back being optimised out.)

This is obviously very far from a real usage situation, but the results were quite interesting:

OS        container type             time
Linux     std::vector                0.0579 seconds
Linux     cVector without realloc    0.0280 seconds
Linux     cVector with realloc       0.0236 seconds
Windows   std::vector                0.0583 seconds
Windows   cVector without realloc    0.0367 seconds
Windows   cVector with realloc       0.0367 seconds

So the first thing that stands out from these results is that using realloc() doesn’t make any significant difference on Windows. I double checked this, and while expand() is definitely avoiding memory copies a significant proportion of the time, this is either not significant in the timings, or memory copy savings are being outweighed by some extra costs in the realloc() call. Maybe realloc() is implemented badly on Windows, or maybe the Windows memory heap is optimised for more common allocation scenarios at the expense of realloc(); I don’t know. A quick Google search shows that other people have seen similar issues.

Apart from that, it looks like realloc() can make a significant performance difference on some platforms (or depending on the memory heap being used). I did some extra testing, and it looks like we’re getting diminishing returns after some of the other performance tweaks we made in our custom vector, specifically the tweaks to increase capacity after the first push_back, and the capacity multiplier tweak. With these tweaks backed out:

OS      container type                        time
Linux   cVector without realloc, no tweaks    0.0532 seconds
Linux   cVector with realloc, no tweaks       0.0235 seconds

So, for this specific benchmark, using realloc() is very significant, and even avoids the need for those other performance tweaks.

Slightly more involved benchmark

The benchmark above is really basic, however, and certainly isn’t a good general benchmark for vector memory use. In fact, with realloc(), there is only actually ever one single allocation made, which is then naturally free to expand through the available memory space!

A similar benchmark is discussed in this stackoverflow question, and in that case the benefits seemed to reduce significantly with more than one vector in use. I hacked the benchmark a bit to see what this does for us:

template <class tVector> static void
PushBackBenchmark_TwoVectors(tVector& target1, tVector& target2)
{
    const int pattern[] = {0,1,2,3,4,5,6,7};
    const int patternLength = sizeof(pattern) / sizeof(*pattern);
    const int iterations = 10000000;
    tSigned32 patternI = 0;
    for(tSigned32 i = 0; i != iterations; ++i)
    {
        target1.push_back(pattern[patternI]);
        target2.push_back(pattern[patternI]);
        ++patternI;
        if(patternI == patternLength)
            patternI = 0;
    }
}
template <class tVector> static void
PushBackBenchmark_ThreeVectors(tVector& target1, tVector& target2, tVector& target3)
{
    const int pattern[] = {0,1,2,3,4,5,6,7};
    const int patternLength = sizeof(pattern) / sizeof(*pattern);
    const int iterations = 10000000;
    tSigned32 patternI = 0;
    for(tSigned32 i = 0; i != iterations; ++i)
    {
        target1.push_back(pattern[patternI]);
        target2.push_back(pattern[patternI]);
        target3.push_back(pattern[patternI]);
        ++patternI;
        if(patternI == patternLength)
            patternI = 0;
    }
}

With PushBackBenchmark_TwoVectors():

OS      container type             time
Linux   std::vector                0.0860 seconds
Linux   cVector without realloc    0.0721 seconds
Linux   cVector with realloc       0.0495 seconds

With PushBackBenchmark_ThreeVectors():

OS      container type             time
Linux   std::vector                0.1291 seconds
Linux   cVector without realloc    0.0856 seconds
Linux   cVector with realloc       0.0618 seconds

That’s kind of unexpected.

If we think about what’s going to happen with the vector buffer allocations in this benchmark, on the assumption of sequential allocations into a simple contiguous memory region, it seems like the separate vector allocations in the modified benchmark versions should actually prevent each other from expanding. And I expected that to reduce the benefits of using realloc. But the speedup is actually a lot more significant for these benchmark versions.

I stepped through the benchmark and the vector buffer allocations are being placed sequentially in a single contiguous memory region, and do initially prevent each other from expanding, but after a while the ‘hole’ at the start of the memory region gets large enough to be reused, and then reallocation becomes possible, and somehow turns out to be an even more significant benefit. Maybe these benchmark versions pushed the memory use into a new segment and incurred some kind of segment setup costs?

With virtual memory and different layers of memory allocation in modern operating systems, and different approaches to heap implementations, it all works out as quite a complicated issue, but it does seem fairly clear, at least, that using realloc() is something that can potentially make a significant difference to vector performance, in at least some cases!

Realloc() in PathEngine

Those are all still very arbitrary benchmarks, and it’s interesting to see how much this actually makes a difference for some real use cases. So I had a look at what difference the realloc() support makes for the vector use in PathEngine.

I tried our standard set of SDK benchmarks (with common queries in some ‘normal’ situations), both with and without realloc() support, and compared the timings for these two cases. It turns out that for this set of benchmarks, using realloc() doesn’t make a significant difference to the benchmark timings. There are some slight improvements in some timings, but nothing very noticeable.

The queries in these benchmarks have already had quite a lot of attention for performance optimisation, of course, and there are a bunch of other performance optimisations already in the SDK that are designed to avoid the need for vector capacity increases in these situations (reuse of vectors for runtime queries, for example). Nevertheless, if we’re asking whether custom allocation with realloc() is ‘necessary or better’ in the specific case of PathEngine vector use (and these specific benchmarks) the answer appears to be that no, this doesn’t really seem to make any concrete difference!

Memory customisation and STL allocators

As I’ve said above, this kind of customisation of the allocator interface (to add stuff like realloc() support) is something that we can’t do with the standard allocator setup (even with C++11).

For completeness it’s worth noting the approach suggested by Alexandrescu in this article where he shows how you can effectively shoehorn stuff like realloc() calls into STL allocators.

But this still depends on using some custom container code to detect special allocator types, and won’t work with std::vector.


This has ended up a lot longer than I originally intended so I’ll go ahead and wrap up here!

To conclude:

  • It’s not so hard to implement your own allocator setup, and integrate this with a custom vector (I hope this post gives you a good idea about what can be involved in this)
  • There are ways to do similar things with the STL, however, and overall this wouldn’t really work out as a strong argument for switching to a custom vector in our case
  • A custom allocator setup will let you do some funky things with memory allocation, if your memory heap will dance the dance, but it’s not always clear that this will translate into actual concrete performance benefits

A couple of things I haven’t talked about:

Memory fragmentation: custom memory interfaces can also be important for avoiding memory fragmentation, and this can be an important issue. We don’t have a system in place for actually measuring memory fragmentation, though, and I’d be interested to hear how other people in the industry actually quantify or benchmark this.

Memory relocation: the concept of ‘relocatable allocators’ is quite interesting, I think, although this has more significant implications for higher level vector based code, and requires moving further away from standard vector usage. This is something I’ll maybe talk about in more depth later on.

** Comments: Please check the comment thread on the original post before commenting. **

by Thomas Young at April 12, 2014 01:17 PM


Chrysologus for Lent XXXIX

Whoever is free from captivity to this mammon, and is no longer weighed down under the cruel burden of money, stands securely with his vantage point in heaven, and from there looks down over the mammon which is holding sway over the world and the worldly with a tyrant's fury.

It holds sway over nations, it gives orders to kingdoms, it wages wars, it equips warriors, it traffics in blood, it transacts death, it threatens homelands, it destroys cities, it conquers peoples, it attacks fortresses, it puts citizens in an uproar, it presides over the marketplace, it wipes out justice, it confuses right and wrong, and by aiming directly at morality it assails one's integrity, it violates truth, it eviscerates one's reputation, it wreaks havoc on one's honor, it dissolves affections, it removes innocence, it keeps compassion buried, it severs relationships, it does not permit friendship. And why should I say more? This is mammon: the master of injustice, since it is unjust in the power it wields over human bodies and minds.

Sermon 126, section 5.

by Brandon ( at April 12, 2014 12:05 PM

assertTrue( )

The Most Deadly Pathogen of All Time

Few bacterial species have had as great an impact on humankind as the members of the Mycobacterium family, which encompass the causative agents of (among other ailments) leprosy, tuberculosis, and Crohn's Disease in humans, and Johne's Disease in farm animals. Leprosy is known from antiquity and continues to strike 200,000 or more people each year worldwide. Tuberculosis, which affects (subclinically) one in three persons worldwide, continues to kill well over a million people a year and has caused a billion deaths in the last two centuries, more than all the wars and genocides of history combined.

The association of M. avium subspecies paratuberculosis (MAP) with Crohn's Disease is still considered controversial by some, but if in fact Koch's criteria have already been met, MAP adds millions more to the toll of human misery caused by Mycobacterial infection.

Colonies of Mycobacterium have a characteristically waxy consistency. Shown here: colonies of M. tuberculosis.
What are these bacteria? Where did they come from? How have they managed to be so successful in causing death and disease?

The prefix "myco" means fungal, but these are not fungi we're talking about. Mycobacteria are soil- and water-borne bacteria that produce an extraordinarily complex cell wall containing not only the usual (for bacteria) peptidoglycans but also:
  • Arabinogalactan
  • Mycolic acids
  • Lipoarabinomannan
  • Extractable lipids including glycolipids, phenolic glycolipids (PGL), glycopeptidolipids (GPL), waxes, acylated trehaloses, and sulfolipids
In contrast to most cell-wall fatty acids (which contain carbon-carbon double bonds susceptible to oxidation), mycolic acids are cyclopropanated and resistant to oxidation, not to mention extremely hydrophobic. The Mycobacterial cell wall thus presents a formidable physical barrier to antibiotics, and it was with considerable dismay that physicians realized, early on, that penicillin would have no benefit in treating tuberculosis. When an antibiotic that could attack M. tuberculosis was finally discovered (streptomycin), it resulted in a 1952 Nobel Prize for Ukrainian American Selman Waksman (although in reality the discovery was made by a post-doc in Waksman's lab, Albert Schatz).

The Mycobacterial cell wall is famously complex, but it also has the curious habit of disappearing entirely, under nutrient-starvation conditions. Like many other bacteria, Mycobacteria can, under certain conditions, shed their cell walls and take on a so-called L-form morphology, in which cells (bounded only by a thin and osmotically vulnerable cell membrane containing just 7% of the usual amount of peptidoglycan) exist as protoplasts which are nonetheless able to reproduce and thrive, producing distinctive colonies on solid media and giving rise, in vivo, to tiny spherules that are often confused with Russell bodies in cancer biopsies. The medical significance of the mysterious L-forms is still debated, after more than 100 years.

The very small red filaments here are cells of Mycobacterium avium living inside lymph-node macrophages in an immunocompromised individual.
One thing most Mycobacterial species have in common is slow growth. Cultures of M. tuberculosis and MAP often require weeks to develop, and M. leprae (which can't be grown in pure culture at all; it can be lab-grown only in the footpads of mice or armadillos) has the longest known generation time of any bacterium, at two weeks.

Ironically, pathogenic strains of Mycobacterium seem to have evolved slow growth as a survival strategy. (This certainly makes them hard to treat with antibiotics. Most antibiotics are effective only in disrupting the growth of actively growing cells.) The lack of DNA mismatch repair enzyme systems (MutS, MutL, and MutH) may be an outcome of the fact that slow DNA replication in these organisms, in and of itself, ensures reasonably high-fidelity replication. On the other hand, lack of a mismatch repair system could be why pseudogenes (genes inactivated due to frameshifts or other errors) abound in Mycobacterial species. M. leprae famously has over 1000 pseudogenes; M. smegmatis strain JS623 harbors over 200 pseudogenes; M. canettii (strain CIPT 140010059) and M. rhodesiae (strain NBB3) both have over 100. (For a good review of Mycobacterial DNA repair systems, see this 2011 paper.)

Unlike Yersinia pestis, the plague organism, which may be less than 20,000 years old (very young in bacterial species time), M. tuberculosis, as a species, appears to be at least 3 million years old, although this number should probably be considered a minimum age, subject to upward revision. (The species was thought to be only 35,000 years old as late as 2002, before a more detailed genetic analysis established the 3-million-year estimate of its age. The numbers should be viewed with caution, however, since they're based on mutation-rate assumptions derived from data for E. coli.)

The question of how M. tuberculosis has managed to achieve its distinctive pathogenic profile is a matter of active ongoing research, and likely will be for a long time. A recent review article reminds us: "The [complete genome] sequence of the pathogen Mycobacterium tuberculosis strain H37Rv has been available for over a decade, but the biology of the pathogen remains poorly understood."

Miscellaneous Links
List of famous T.B. victims—Brontë family, Balzac, Kafka, Thoreau, Kant, Chekhov, Orwell, Schrödinger, Vivien Leigh, Arline Feynman (wife of the famous physicist), the list goes on.
Tuberculosis in Literature and the Arts
The T.B. Blues (Jimmie Rodgers, 1931) This song, famously covered by Leon Redbone (among others), was written by Rodgers after he contracted the disease at age 27. He died eight years later.
World Health Organization TB Stats (landing page)
The Tuberculosis Systems Biology Program

by Kas Thomas ( at April 12, 2014 04:00 AM

April 11, 2014

The Tech Report - News

Friday night topic: the unwritten rules of tipping

When you think about it, giving extra money to folks who do certain jobs is kind of a weird tradition. Yet we practice it regularly. What's weirdest about tipping is that it seems to follow an unwritten set of conventions that we're largely supposed to know, apparently by osmosis.

Generally speaking, I'm happy to tip, and I fall mostly on the generous side. I usually give 20% of the check (and I ...


April 11, 2014 10:04 PM

Watch Dogs trailer shows PC eye candy

When the last Watch Dogs trailer came out in March, I expressed some disappointment about the game's looks. Well, another clip turned up on YouTube this morning, and this one is all about the eye candy in the game's PC release. I have to say, it looks better than I thought:



April 11, 2014 09:54 PM

Cool Tools

Wink’s remarkable book picks of the week

Wink is Cool Tools’ website that reviews one remarkable paper book every weekday. We take photos of the covers and the interior pages of the books to show you why we love them.

This week we reviewed books about excellent optical illusions, the events of one day in WW1 told in the form one long continuous pen drawing printed on a fold-out scroll, Ernest Shackleton’s brave yet disastrous attempt to cross the Antarctic continent, hundreds of science-themed tattoos worn by working scientists, the sketchbooks of artists from around the world, and the greatest comic books ever published in a bound slipcase.

Take a look at these books and many others at Wink.

-- Mark Frauenfelder

by mark at April 11, 2014 09:33 PM

John C. Wright's Journal

The Diskos

A reader has a question about AWAKE IN THE NIGHT:

Mr Wright,
I’m having a little trouble visualizing the Diskos. Sometimes I see it as a lance with a cone shaped spinning shaft or blade and sometimes as an ax like weapon. Am I right with either visualization? Ah I have reached the part of the second story that describes a Diskos without the blade.

Happy to help! The diskos looks like a pizza cutter. Or maybe like a unicycle.

I do not have the paragraph in front of me, but I believe that the narrator calls the ‘blade’ of the weapon by the term ‘diskos’ that is, the disk-shaped cutting blade, but he also uses this term for the whole weapon, the same way we might call a sword ‘a blade’.

Maybe a picture will make this clear.

This is what the diskos weapon looks like. It is a two-handed axe whose blade is like a living, electrified sawblade, if you lived in a universe where electricity was also life-energy pulsing with psychic force.

With the blade dismounted, the forks look like this:


by John C Wright at April 11, 2014 09:33 PM


CrossFit Naptown

“Partner Dead – Cindy” & Box Crawl for a Cause Today


20 Minute AMRAP
“Partner Dead – Cindy”

Partner 1: Complete rounds of Cindy
5 Pull Ups
10 Push Ups
15 Air Squats

Partner 2:
Hold Barbell in locked out Deadlift Position (135/95)





Notes for Saturday Box Crawl with a Cause
Ends around 1:15pm with a delicious paleo-certified meal from Artie’s Paleo on the Go and even a beer if you wish!

ALL Members: Around 1:30pm. Invite friends and family to join-in to celebrate the completion of the Open and to hang out with your fellow gym members outside of a WOD. Bring outdoor games if you have them. Artie’s will be here with limited food, so you may want to bring some food too.




Box Crawl for a Cause | Leukemia & Lymphoma Society Fundraiser 

The Cause
A loyal Crossfitter and friend at CrossFit NapTown has been asked to run for Man of the Year – an annual campaign for the state of Indiana dedicated to raising money for the Leukemia & Lymphoma Society. Being asked to run for Man of the Year is both an honor and a commitment. Josh Driver (the candidate) has committed to raising a minimum of $10,000 – but has a personal goal of $75,000. He approached our gym to leverage our community and we thought it was a great reason to make something happen that would reach beyond our gym. We have 15 spots for you – our members – you in? Read on!

For more information: see his website @ or

The Event
Give Big. Get Huge. The Josh Driver Show is bringing you a Saturday filled with wall balls, bellow parallel squats and hand stand pushups … and several other pain and smile inducing moves! Join in for an afternoon tour of some Indianapolis boxes for $75 – and a good cause. Bring your desire to pump some iron and have fun, we’ll take care of the rest!

The Details
Saturday, April 12; 9:00am Start Time and 1pm End Time *Tailgate and lunch to follow

We will have bus transportation from Crossfit Naptown then to Crossfit Unbreakable then to Crossfit Zionsville, ending back downtown.

Register Now!

*A password is required to access the eventbrite event – our password is NAPTOWN.

Cost: $75 But you get a lot … and it’s for a great cause

- Box Crawl Hat!
- Artie’s Paleo on the Go meal post-crawl at the Tailgate Party (at CFNT)
- Transportation to three boxes around the city
- Three ½ hour WODS (partner style, fun for all!)
- Camaraderie, fun and the sense you gave back while doing something for yourself!


Day-Of Agenda
9:00am meet at Crossfit Naptown
9:30am bus leaves Crossfit Naptown (with all participants on board) for Crossfit Zionsville
10:00am bus arrives at Crossfit Zionsville
10:45am bus leaves for Crossfit Unbreakable
11:15am bus arrives at Crossfit Unbreakable
12:00pm bus leaves for Crossfit Naptown
12:30pm bus arrives at Crossfit Naptown
End of transportation, eat drink and be merry! :)

by Coach Jared at April 11, 2014 09:00 PM

Crossway Blog

Digital Roundup – 4/11/14

Digital Header Image

1. Social e-reading app Readmill acquired by Dropbox, being discontinued.

Readmill, one of the e-reader apps we like to recommend, recently announced its discontinuation. Read their epilogue here. We plan to provide an updated list of recommended e-reader apps soon!

2. Barnes and Noble Nook update available.

Barnes and Noble released a firmware update for the Nook Glowlight. They also announced plans to release a new tablet later this year. Good news, Nook users: you can purchase e-books from Crossway, download them from your virtual bookshelf, and upload them onto your device.

3. E-readers make increased reading easier, study shows.

Are you an avid reader? According to a recent study, owners of e-readers are more likely to read than the average person. For an easy way to keep up with daily Bible reading, check out our ESV Daily Reading Bible e-book that portions the text into 365 daily readings.

4. Mentors for your Christian life can be found on your (virtual) bookshelf.

In this article, the Village Church offers 3 tips on how to transform reading into a deeply personal experience to aid in your spiritual formation. One great place to start would be Crossway’s Theologians on the Christian Life series, which explores the lives, writings, and theology of the great teachers of the Christian faith. Pick up your tablet and begin a conversation in the margins with wise saints such as John Calvin, Francis Schaeffer, Dietrich Bonhoeffer, John Wesley, & B.B. Warfield.

For more digital and tech updates, follow us on Twitter (@CrosswayDigital).

by Matt Tully at April 11, 2014 08:49 PM

The Outlaw Way


We should have both Outlaw Power, and Outlaw Connectivity (gymnastics skill work) live by Monday’s post. Along with the new Outlaw Barbell template, this will make up the body of supplemental work which each of you will be able to use to taper to your individual needs. I will explain the direction of each of these, and exactly how you should use them, on Monday’s post.

Along with the new supplemental templates, we will be announcing some big changes to the future of our training camps. These changes will give you guys even more options to learn from our team of coaches. Our full three-day training camps will remain the base of our educational camp series, and will still cover a broad range of topics. The next series will include new topics and changes to make them fresh for everyone attending, while still providing the foundation of instruction that has built this site/program. However, after realizing it was impossible to cover as much detail as we would like in a broader setting, we’ve decided to add two new specialty camps which will be focused on providing the kind of precise development we obsess about.

The first will be a dedicated Outlaw Barbell seminar featuring Head Coach and National Champion – Jared Fleming, along with his Coach and Father – Dave Fleming. The second will be an Outlaw “Skills and Drills” (aka Gymnastics) seminar featuring Three-Time Games Competitor – Daniel Tyminski, and new Outlaw Connectivity Head Coach and former Cornell gymnast – Katilin Hardy (she’s also a Level 10 coach, and was formerly ranked in the top 20 in the country as a competitor). Needless to say – the attention to detail at all camps will be astronomical, and we feel that we’ve put together one of the best overall education teams in all of strength and conditioning.

If any of you would be interested in hosting any of our camps, please email

As always, perfection of movement remains the obsession.

I could be wrong, but I believe this is the first ever – in Outlaw land – of what I would consider to be the “holy grail” of the Clean. Jay Rhodes has been through a long, injured off season, and has finally gotten back to training hard. “White Lightning” has always been pretty good with a barbell – especially for a guy that weighs a hair under 175 pounds. Yesterday he PR’d once, then hit the first DOUBLE BODYWEIGHT Clean in Outlaw history. Here’s his 350# Clean from blocks…

WOD 140412:


1) 12 minutes to establish a 1RM Snatch.

2) 12 minutes to establish a 1RM Clean & Jerk.


For time:

50′ Front Rack Walking Lunges 95/65#
15 Deficit HSPU 6/4″
30 OHS 95/65#
12 Deficit HSPU 6/4″
24 OHS 95/65#
9 Deficit HSPU 6/4″
18 OHS 95/65#
50′ Front Rack Walking Lunges 95/65#

The post 140412 appeared first on The Outlaw Way.

by at April 11, 2014 08:47 PM

512 Pixels

On the NSA and Heartbleed →

Michael Riley at Bloomberg:

The U.S. National Security Agency knew for at least two years about a flaw in the way that many websites send sensitive information, now dubbed the Heartbleed bug, and regularly used it to gather critical intelligence, two people familiar with the matter said.

The NSA’s decision to keep the bug secret in pursuit of national security interests threatens to renew the rancorous debate over the role of the government’s top computer experts.

So, instead of alerting the American people (and Internet users everywhere) to this shockingly bad bug in OpenSSL, the federal government took advantage of it, possibly using private keys to decrypt data it had gathered via its myriad of tools.

A year ago, I wouldn't have believed this. When the Heartbleed news broke the other day, I just assumed the government was using the exploit to spy on people. Hell, part of me thinks the NSA was behind it in the first place.

Update: The NSA has released a statement saying it was unaware of the OpenSSL bug until it was publicly disclosed. The language used is pretty clear. Make of that what you will, but the fact that we have to have this discussion is simply terrible.


by Stephen Hackett at April 11, 2014 07:53 PM

Daniel Lemire's blog

Probabilities and the C++ standard

The new C++ standard introduced hash functions and hash tables in the language (as “unordered maps”).

As every good programmer should know, hash tables only work well if collisions between keys are rare. That is, if you have two distinct keys k1 and k2, you want their hash values h(k1) and h(k2) to differ most of the time.

The C++ standard does not tell us how the keys are hashed but it gives us two rules:

  • The value returned by h(k) shall depend only on the argument k.
  • For two different values k1 and k2, the probability that h(k1) and h(k2) “compare equal” (sic) should be very small.

The first rule says that h(k) must be deterministic. This is in contrast with languages like Java where the hash value can depend on a random number if you want (as long as the value remains the same throughout the execution of a given program).

It is a reasonable rule. It means that if you are iterating through the keys of an “unordered set”, you will always visit the keys in the same order… no matter how many times you run your program.

It also means, unfortunately, that if you find two values k1 and k2 such that h(k1) and h(k2) are equal, then they will always be equal, for every program and every execution of said programs.

The second rule is less reasonable. We have that h(k1) and h(k2) are constant values that are always the same. There is no random model involved. Yet, somehow, we want the probability that they are equal to be low.

I am guessing that they mean that if you pick k1 and k2 randomly, the probability that they will hash to the same value is low, but I am not sure. If it is what they mean, then it is a very weak requirement: a vendor could simply hash strings down to their first character. That is a terrible hash function!
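To see how weak that requirement is, here is a minimal C++ sketch (the `FirstCharHash` functor and the `collides` helper are hypothetical, written for this illustration only): a hash that is perfectly deterministic, as the first rule demands, and rarely collides for *randomly chosen* strings, yet hashes every string down to its first character.

```cpp
#include <cstddef>
#include <string>
#include <unordered_set>

// Hypothetical illustration (not any vendor's implementation): a hash
// that satisfies rule 1 -- h(k) depends only on k and is deterministic --
// yet maps every string to its first character.
struct FirstCharHash {
    std::size_t operator()(const std::string& s) const {
        return s.empty() ? 0 : static_cast<std::size_t>(s[0]);
    }
};

// Any two keys sharing a first letter collide.
inline bool collides(const std::string& a, const std::string& b) {
    return FirstCharHash{}(a) == FirstCharHash{}(b);
}

// A standard container would happily accept it as a custom hasher.
using BadSet = std::unordered_set<std::string, FirstCharHash>;
```

With this hasher, `collides("alpha", "aardvark")` is true even though the keys differ, so a `BadSet` filled with same-letter keys degenerates into one long bucket: lookups degrade from O(1) toward O(n) while the standard's wording is still, arguably, satisfied.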

I am under the impression that the next revision of the C++ standard will fix this issue by following in Java’s footsteps and allowing hash functions to vary from one run of a program to another. That is, C++ will embrace random hashing. This will help us build safer software.
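As a sketch of what such Java-style random hashing could look like (`SeededHash` and `process_seed` are hypothetical names, not anything in the standard): a random seed is drawn once per process and mixed into every hash, so values are stable within one run, preserving deterministic iteration order for that execution, but differ between runs, defeating attackers who precompute colliding keys.

```cpp
#include <cstddef>
#include <functional>
#include <random>
#include <string>

// Hypothetical sketch of random hashing. The seed is fixed for the
// lifetime of the process, and fresh on the next run.
inline std::size_t process_seed() {
    static const std::size_t seed = std::random_device{}();
    return seed;
}

struct SeededHash {
    std::size_t operator()(const std::string& s) const {
        // Mix the run-local seed into the deterministic standard hash.
        return std::hash<std::string>{}(s) ^ process_seed();
    }
};
```

Within one execution, `SeededHash{}("key")` always returns the same value, so rule 1 still holds per run; two separate runs of the program will generally disagree, which is exactly the property that makes precomputed collision attacks impractical.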

by Daniel Lemire at April 11, 2014 07:48 PM

Blog & Mablog

How the Pinning Works

I want to spend a few moments on why the penal substitution of Christ is the only possible ground of human happiness. My point is not to defend the doctrine here — that has been ably done by others — but rather to show one of the many glorious outworkings of the doctrine. In our life together, whether that life is being lived in family, church, or town, the substitutionary death of Jesus is the only thing that can keep us from becoming scolds who are impossible to live with.

This is what I mean, and I will use marriage for my example. Husbands are told to love their wives as Christ loved the Church, and gave Himself up for her (Eph. 5:25). Now, whatever it is we believe that He did there, that is what we are going to imitate.

Unless you believe that at the heart of the atonement we find a complete identification of Christ with His people, then what you will imitate is that same failure to identify. But if you understand that the cross was the place where God went “all in,” then your love for your wife and family will likewise be all in.

If Jesus was just setting an example, or just doing some other thing external to us, then our imitation of this will tend toward the bossy and censorious. How many moral examples are crushing examples? How many things done for us, outside of us, designed to make us grateful, are actually burdens that are being tied on our backs by Pharisees? But Christ’s example and Christ’s gifts to us are not like that at all. They are true liberation. Why? Because He died in our place, and only because He died in our place.

If we take that away, then morality ceases to be liberation, and becomes what little we learn in lectures full of scolding and hectoring, and finger pointing. It becomes the kind of righteousness that the devil loves to go on about.

We are never exercising biblical authority over others unless we are identifying with them as we do so. In order to identify this way, we need an example — because we don’t think this way naturally. To use Chesterton’s image, we tend to bestow honor by pinning a cross on a hero, while God did it by pinning a hero on a cross.

And unless our sins were pinned there with him, we have no hope in our lives together. No hope at all.

by Douglas Wilson at April 11, 2014 06:46 PM

John C. Wright's Journal

And More Pictures!

After posting a number of pictures of eye-candy, I should also show you some pictures which excite me even more.

Yes, of course, I mean Gervasio Gallardo’s cover paintings for a few of my favorites from Lin Carter’s Ballantine Adult Fantasy series.

Dream Quest of Unknown Kadath by H.P. Lovecraft


Beyond the Golden Stair by Hannes Bok

Gervasio_Gallardo (2)

Lud-in-the-Mist By Hope Mirrlees

Gervasio_Gallardo (3)

Those who have read these books will recognize the elements of the story, faithfully if phantasmagorically rendered: the cats of Ulthar, the steps beyond the Gates of Deeper Slumber, the zebra of the mighty isle of Baharna, the fabulous sunset city glimpsed only in a dream three times by Randolph Carter.

Likewise, the nostalgic reader will recognize the strange elfin fruit from Fairyland floating down the river Dapple, or the flower-women of the planet Voltap in the triple-sun system ruled by the listless tyrant-enchanter Maal Dweb.

The Well at the World’s End by William Morris

Gervasio_Gallardo (4)

Poseidonis by Clark Ashton Smith

Gervasio_Gallardo (5)

Xiccarph by Clark Ashton Smith


Some delight me as much for the titles as the pictures. Well at the World’s End, in my humble opinion, is of all titles for a fantasy novel the most fantastic, more redolent of a sense of eerie wonder, unless it were, perhaps, the even more eerie title Dream Quest of Unknown Kadath. These titles promise us that they are set, in the words of Lord Dunsany ‘Beyond the Fields we Know’. To this day, Lud-in-the-Mist brings a wry half-smile to my lip, whereas Xiccarph echoes in my heart like the strange blare of far oriental trumpets.

by John C Wright at April 11, 2014 06:45 PM

Parchment and Pen

Theology Unplugged: Church (Part 5) – Ordination

Join Michael Patton, Tim Kimberley and Sam Storms as they continue their new series on the Church. This is a topic hotly debated today. What really does it take to be a church? Can three people meet at a coffee shop and call themselves a church? Do churches need to have elders? What about an online church?

There are so many questions being asked today about the Church in the 21st century. This series seeks to dive into the prominent issues of Ecclesiology (the study of the Church).

Theology Unplugged: Video Edition is available for the first time to Credo House Members. You can now listen AND WATCH as Michael, Tim, Sam and JJ dive into issues of theology. Grow in your faith, learn theology, and have a good time. Try Membership risk free! If you don’t love it as much as us, you can cancel at any time.


by Tim Kimberley at April 11, 2014 06:18 PM

John C. Wright's Journal

Would You Like a Slice of Cheesecake?

Allow me to present some pictures of the lovely Donna Reed, to which a reader brought my attention:



And a picture of Veronica Lake:


Now, I am normally a firm believer that women in the 1950s and 1940s were, on the whole, cuter than modern women, but there are exceptions. This young lady is named Lara Stone.


One exception is Jennifer Connelly, who is certainly easy on the eyes:




Also, Hayley Atwell is quite attractive, despite that she is not from the 1950s or 1940s:

Normally I do not find Christina Aguilera particularly attractive, but for some reason in this video, she seems to have discovered how to be really appealing. I am not sure what it is. It cannot be merely some quirk of my own, right?

Also, Rachel Weisz, at least in certain of her films, is remarkably attractive:


Joan Collins is lovely, also:


There is something about certain of these modern actresses that I honestly think makes them equal in glamor to those of the prior generation. I just cannot put my finger on what it is. Maybe it is because they appear in Science Fiction flicks?

Karen Gillan:

Jenna Coleman:


by John C Wright at April 11, 2014 06:11 PM

Schneier on Security

More on Heartbleed

This is an update to my earlier post.

Cloudflare is reporting that it's very difficult, if not practically impossible, to steal SSL private keys with this attack.

Here's the good news: after extensive testing on our software stack, we have been unable to successfully use Heartbleed on a vulnerable server to retrieve any private key data. Note that is not the same as saying it is impossible to use Heartbleed to get private keys. We do not yet feel comfortable saying that. However, if it is possible, it is at a minimum very hard. And, we have reason to believe based on the data structures used by OpenSSL and the modified version of NGINX that we use, that it may in fact be impossible.

The reasoning is complicated, and I suggest people read the post. What I have heard from people who actually ran the attack against various servers is that what you get is a huge variety of cruft, ranging from indecipherable binary to useless log messages to people's passwords. The variability is huge.

This xkcd comic is a very good explanation of how the vulnerability works. And this post by Dan Kaminsky is worth reading.
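For readers who prefer code to comics, here is a minimal sketch of the bug pattern, not OpenSSL's actual code (`heartbeat_reply` and `heartbeat_ok` are hypothetical names invented for this illustration): the server echoes back as many bytes as the client *claims* to have sent, without checking the claim against the payload's real size, so memory adjacent to the payload leaks.

```cpp
#include <cstddef>
#include <string>

// The bug pattern: build the reply from claimed_len bytes, trusting the
// attacker-supplied length. Bytes of server memory sitting after the
// actual payload are copied into the reply.
std::string heartbeat_reply(const char* memory, std::size_t actual_len,
                            std::size_t claimed_len) {
    (void)actual_len;  // the vulnerable path never consults it
    return std::string(memory, claimed_len);
}

// The essence of the fix: refuse any request whose claimed length
// exceeds the payload actually received.
bool heartbeat_ok(std::size_t actual_len, std::size_t claimed_len) {
    return claimed_len <= actual_len;
}
```

With server memory laid out as `"HELLO"` followed by `"secret"`, a request with an actual payload of 5 bytes but a claimed length of 11 gets back the payload plus the adjacent secret, which is why repeated heartbeats dredge up the "huge variety of cruft" described above.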

I have a lot to say about the human aspects of this: auditing of open-source code, how the responsible disclosure process worked in this case, the ease with which anyone could weaponize this with just a few lines of script, how we explain vulnerabilities to the public -- and the role that impressive logo played in the process -- and our certificate issuance and revocation process. This may be a massive computer vulnerability, but all of the interesting aspects of it are human.

EDITED TO ADD (4/12): We have one example of someone successfully retrieving an SSL private key using Heartbleed. So it's possible, but it seems to be much harder than we originally thought.

And we have a story where two anonymous sources have claimed that the NSA has been exploiting Heartbleed for two years.

EDITED TO ADD (4/12): Hijacking user sessions with Heartbleed. And a nice essay on the marketing and communications around the vulnerability.

EDITED TO ADD (4/13): The US intelligence community has denied prior knowledge of Heartbleed. The statement is word-game free:

NSA was not aware of the recently identified vulnerability in OpenSSL, the so-called Heartbleed vulnerability, until it was made public in a private sector cybersecurity report. Reports that say otherwise are wrong.

The statement also says:

Unless there is a clear national security or law enforcement need, this process is biased toward responsibly disclosing such vulnerabilities.

Since when is "law enforcement need" included in that decision process? This national security exception to law and process is extending much too far into normal police work.

Another point. According to the original Bloomberg article:

Certainly a plausible statement. But if those millions didn't discover something obvious like Heartbleed, shouldn't we investigate them for incompetence?

Finally -- not related to the NSA -- this is good information on which sites are still vulnerable, including historical data.

by schneier at April 11, 2014 06:10 PM

The Brooks Review

xkcd: Heartbleed Explanation

And now everyone gets it.

Please consider becoming a member to support my writing. All writing is 100% member funded.

by Ben Brooks at April 11, 2014 06:02 PM

CrossFit 204

Workout: April 14, 2014

Congrats to Taylor on earning a spot at the Canada West Regional!


Work up to 1 heavy set of 10 lunges (5 per leg)

Squat 8-8-8-8

Rest no more than 2 minutes between sets


Overhead squat 8-8-8

Front squat 10-10-10

Back squat 12-12-12

Rest only 60 seconds between all sets. Adjust the load as needed between any set.

Rest as needed.

8 minutes of:

8 front-rack lunges (4 per leg, 155/110 lb.)

10 chest-to-bar pull-ups

by Mike at April 11, 2014 05:49 PM

Workout: April 13, 2014

Pam and Co.


Draggin’s Den

6 rounds of:

Row 250 meters

1 75-foot sled drag

8 burpees


Snatch balance: heavy single

Head to the bike path along Silver. Bring extra shoes. No mud will be allowed in the gym.

3 rounds of:

Run along the bike path from Inglewood to the footbridge and back

Rest 3 minutes


Session 1 (performed in skills):

Thruster ladder: starting at 135/105, add 10 lb. every 40 seconds until failure – don’t overload rubber plates with extra metal change plates – get someone to help you load appropriate rubber and metal to avoid crushing the bumpers

Snatch balance: heavy single

12 legless rope climbs from standing or seated

Session 2:

Head to the bike path along Silver. Bring extra shoes. No mud will be allowed in the gym.

3 rounds of:

Run along the bike path from Inglewood to the footbridge and back

Rest 3 minutes



by Mike at April 11, 2014 05:42 PM

Workout: April 12, 2014

Tyson's heading back to Vancouver and the Canada West Regional!


Power clean work

12 minutes:

Odd minutes: 8 shoulders to overheads

Even minutes: 8 power cleans


Session 1: Performed in class

Five sets of: touch-and-go power-clean triple + 1 jerk

12 minutes:

Odd minutes: 8 fat-bar STOs (155/115 lb.)

Even minutes: 8 fat-bar power cleans (155/115 lb.)

Session 2: noon

3 rounds of:

20 kettlebell swings (70/55 lb.)

10 handstand push-ups

3 rounds of:

20 wall-balls (20 lb., 10 ft.)

15 toes-to-bars

3 rounds of:

20 box jumps (24/20 inches)

5 muscle-ups

by Mike at April 11, 2014 05:36 PM

Blessed Earth » Blogs

Nancy Green

I believed in Christmas even before I believed in Christ.

First, some personal history: Matthew was raised in a church-on-Sundays Protestant home. His Christmas celebrations were small because there were five kids and not much money to spare. I was raised in a conservative Jewish home. My Christmas celebrations were nonexistent—we lit the menorah and played games with a spinning top called a dreidel.

I met Matthew when he was a carpenter putting a bay window in my parents’ house. I was a rather spoiled college freshman, home for December study week just before my first set of finals. Matthew says that my parents’ worst nightmare came true: their daughter fell in love with the carpenter. By February, we were dating. In April, he asked me to marry him. I told him I needed to ask my mother.

Two years later, we married. I had been telling Matthew that he was the smartest person I’d ever met and that maybe he should think about going to college. So he did. Because of a severe case of dyslexia and a lack of direction, he had graduated third from last in his high school class—and that was in the vocational program! But the calculator was invented in the meantime, so math and science were no longer stumbling blocks. Now Matthew had a clear goal: to become a doctor. He enrolled in a state college, worked hard, and excelled.

Fast forward seven years. Clark, our son, was born the month Matthew graduated from medical school. And still, we didn’t have any clear anchor, no faith traditions to harbor us from the storms. So we made it up as we went along—candy-filled baskets at Easter, matzo-ball soup for Passover, stockings at Christmas, and potato pancakes for Hanukkah. By the time our kids were in elementary school, they were so confused they thought the “fiddler on the roof” slid down the chimney, and if he saw his shadow, he laid an Easter egg.

My fondest Christmas memory from those early years occurred when Clark was three and Emma was one. Matthew’s parents had sent a tin of cookies packed in a box with lots of newspaper. At the time, one of Clark’s chores was to crumple newspaper each evening to start our fire. While Emma happily sampled each type of cookie, Clark smiled up at me, his face shining: “Look at what Santa gave me! Lots of newspaper already crumpled up!”

Our Christmas traditions grew along with the children. When Clark was five, some kindly neighbors felt sorry for him and gave us a small artificial Christmas tree and some decorations. A year later, we moved to our doctor’s-sized house and bought a doctor’s-sized spruce. Christmas Eve was always spent at our next-door neighbor’s, along with several dozen other families and a never-ending supply of jumbo shrimp and champagne. Because my friends all dropped by with picture-perfect handcrafted presents, I reciprocated. We exchanged artsy Christmas cards—more beautiful than anything Martha Stewart could ever dream up. We caroled, wrote letters to Santa, left food for the reindeer, woke up at dawn to open presents, and ate a Christmas breakfast big enough to feed an army of angels. But still, my family did not know God’s Son.

Eventually, some bad stuff happened in our lives—as it does to everyone. Just shy of his thirty-third birthday, my brother drowned while we were all on a family vacation. Matthew turned back to the faith of his youth, but this time it took. Jesus was alive and ready to share our burdens. One by one, everyone in our family came to Christ.

The funny thing is, the more we believed in Jesus, the less complicated our Christmas celebrations became. We stopped going to parties on Christmas Eve and started going to church. We stopped buying elaborate presents and started giving to special charities in each other’s names. Instead of designing the most impressive Christmas card, we made just one, stuffing its envelope with cash—the kids contributing from their piggy banks—and leaving it anonymously taped to the door of a needy family. We even convinced our extended family to exchange gifts with only one person, picking names out of a hat after the Thanksgiving meal.

Last Christmas was our best ever, and our simplest. I did no Christmas shopping. Emma asked if she could fill the stockings. She delighted in choosing a charity for each of us, finding used copies of The Woodland Folk series that we used to read when the kids were little, making gift certificates for breakfast in bed or a back rub, and ordering a favorite Christian comedian’s DVD for the whole family. We still made a big breakfast—just not quite so big. The Christmas cards, traveling, Christmas tree—all gone—along with the stress, trips to the mall, credit-card debt, and hectic holiday schedules. Instead, we read aloud the account of Christ’s birth. We stayed in our pajamas and watched the new DVD together.

It took a miracle—a Jewish girl learning to love Jesus—to put Christ at the center of our Christmas.

by brian at April 11, 2014 05:34 PM

Justin Taylor

Why Eugene Peterson Keeps Reading Calvin’s Institutes

Eugene Peterson, writing in Books & Culture:

Although I had been a pastor for a couple of years, I had little interest in theology. It was worse than that. My experience of theology was contaminated by adolescent polemics and hairsplitting apologetics. When I arrived at my university, my first impression was that the students most interested in religion were mostly interested in arguing. Theological discussions always seemed to set off a combative instinct among my peers. They left me with a sour taste. The grand and soaring realities of God and the Holy Spirit, Scripture and Jesus, salvation and creation and a holy life always seemed to get ground down into contentious, mean-spirited arguments: predestination and freewill, grace and works, Calvinism and Arminianism, liberal and conservative, supra- and infralapsarianism. The name Calvin was in particularly bad odor. I took refuge in philosophy and literature, where I was able to find companions for cultivating wonder and exploring meaning. When I entered seminary I managed to keep theology benched on the sidelines by plunging into the biblical languages.

But midway through [Douglas] Steere’s lecture, theology, and Calvin along with it, bounded off the bench. A new translation of the Institutes by Ford Lewis Battles (edited by John T. McNeill) had recently been published. I knew of the work of Dr. Steere and trusted him. But Calvin? And theology? After the hour’s lecture, most (maybe all) of my stereotyped preconceptions of both Calvin and theology had been dispersed. Steere was freshly energized by the new translation. He talked at length of the graceful literary style of the writing, the soaring architectural splendor of this spiritual classic, the clarity and beauty of the thinking, the penetrating insights and comprehensive imagination.

The lecture did its work in me—if Calvin was this good after four hundred years, I wanted to read his work for myself. The next day I went to a bookstore and bought the two volumes and began reading them. I read them through in a year, and when I finished I read them again. I’ve been reading them ever since.

by Justin Taylor at April 11, 2014 05:34 PM

John C. Wright's Journal

Read Matt Walsh

Simply a superb article today by Matt Walsh.

As you might imagine, I was recently reacquainted with the rather sickening idea that I have a duty to show reverence for a political office, when I wrote a post last week where I merely called the president a liar. Indeed, anytime you criticize the president with an intent more serious than playfully teasing him for picking the wrong team in his March Madness bracket – anytime you attack authority, particularly presidential authority, particularly THIS president’s authority — the ‘respect the office’ propagators will come streaming in, fingers-a-wagging and heads-a-shaking.

‘Respect the office,’ they gush. Noticeably, the folks most concerned with respecting Obama’s office weren’t to be heard from during that certain eight year period where Bush was daily cut down as anything from Hitler Incarnate

Read the whole thing here.
I disagree with him on one technicality: officers in uniform must properly salute the Commander-in-Chief and proffer other signs of subordination and respect as military discipline requires. Otherwise, I agree wholly. As civilians, we need only proffer signs of respect for the Constitution that the President serves and we obey. Our respect for him is the minimum required to maintain public order, and is abrogated when he acts against the office and the Constitution.

by John C Wright at April 11, 2014 05:30 PM


Living Under Cover

One of the things that struck me about Brendan Eich being forced out as CEO of Mozilla is precisely that this is not a guy who is known as some kind of political activist. Indeed, some of his longtime fellow executives (who, once they heard of the donation, joined in the sentiment that he did not belong at Mozilla) expressed shock that Eich even held such views.
Baker said that she had not known about Eich’s views on gay marriage throughout most of their working relationship, until the donation came to light last year.

“That was shocking to me, because I never saw any kind of behavior or attitude from him that was not in line with Mozilla’s values of inclusiveness,” she said, noting that there was a long and public community process about what to do about it in which Eich, then CTO, participated. “But I overestimated that experience.”

Baker — who became emotional at one point during the interview — noted that she was “doing a fair amount of self-reflection and I am wondering how did I miss it that this would matter more when he was the CEO.” [source]
I don't know if I'd claim to be that far under cover. I have a pretty strict personal policy of not talking about politics or religion at work unless directly asked, but the pictures of six kids at work are probably a giveaway that there's something not-quite-savory about me from a modern secular perspective. Still, I figure that at work my job is to do pricing analytics, and so I tend to stick to business and lead a fairly under-cover life. This is enabled by the fact that in the companies I've worked for, there seems to be a general consensus that this is how it's done. The cultural hot topics generally don't get discussed, and so getting along isn't a matter of cravenly denying one's beliefs so much as everyone simply agreeing to leave contentious topics outside the door. If I'm asked to do something that violates my beliefs, I'm prepared to take a stand at that point, but I'm willing to not discuss things (including my beliefs) that make people uncomfortable so long as I'm not asked to actively violate those quietly held beliefs.

A while back, at another company, I had a boss who was gay and lived with a long term partner. He knew that I had a lot of kids and that I was Catholic, so he may have guessed that I had moral objections to his lifestyle -- or he may have assumed that I was some more enlightened form of Catholic who didn't agree with the Church. He didn't ask, and I didn't tell.

Living under cover can be a bit wearing at times, and I find myself applying it (without any real reason) to other areas of my life as well that have nothing to do with culture war controversy. I have a policy of never connecting with anyone from work on Facebook or mentioning that I have a blog, which seems like a straightforward, common sense sort of precaution. But other things I usually don't mention I don't even have a good reason for: hobbies, books I'm reading, working on a novel. Anything that seems a trifle too distinctive my first reaction is to not bother mentioning unless it comes up. (When I do run into a co-worker who I realize shares my religious and cultural worldview, that immediately creates a much closer bond in that he or she becomes one of the few people I can talk unguardedly with.)

All this can be a little wearing, but I'd always assumed it was simply the natural cost of being a cultural and religious minority (which, although most Americans profess some form of Christianity, is what being an orthodox Catholic nonetheless makes one in modern society).

As such, the Eich affair has been particularly chilling, since the message that it sends is: "How you behave at work is not enough. If you dare to have beliefs as a person which you act on (no matter how far away from work you are when you do so) we will hunt you down and drive you out."

That doesn't particularly make me feel like living more "out and proud" at work in relation to my beliefs. My reticence is habitual at this point. And some of it is simply personality. But it is a good reminder of what we're up against. No matter how willing I am to keep myself to myself while in the office, the feeling will not necessarily always be mutual. It's a realization that breeds detachment -- at any time all that you have may be taken from you -- but also a certain feeling of spoiling for a fight.

by Darwin ( at April 11, 2014 05:26 PM

The Brooks Review

Drop Condoleezza Rice or we will #dropdropbox

I agree wholeheartedly:

Choosing Condoleezza Rice for Dropbox’s Board is problematic on a number of deeper levels, and invites serious concerns about Drew Houston and the senior leadership at Dropbox’s commitment to freedom, openness, and ethics. When a company quite literally has access to all of your data, ethics become more than a fun thought experiment.

Please consider becoming a member to support my writing. All writing is 100% member funded.

by Ben Brooks at April 11, 2014 05:25 PM

The Tech Report - News

Researchers demo new method of creating quantum logic

A pair of research teams at the Max Planck Institute in Germany and Harvard University have demonstrated a new type of quantum logic gate and switch that could form the basis of quantum computers, according to this report at Popular Mechanics.

The big advance, it seems, is establishing a reliable way to put a rubidium atom into the mind-bending state of superposition, where it is both "on" and "off." Not only that, but the researchers have managed to create a mechanism for propagating this state ...


April 11, 2014 04:49 PM

512 Pixels

Reeder 2 for Mac enters public beta →

Viticci has a run-down of the new beta. I've kicked ReadKit to the curb already.


by Stephen Hackett at April 11, 2014 04:30 PM

The Brooks Review

Oso Washington Mudslide

Great photos from Joshua Trujillo which document the devastation in Oso, Washington. Also a good use of Exposure.

Please consider becoming a member to support my writing. All writing is 100% member funded.

by Ben Brooks at April 11, 2014 04:25 PM

The Tech Report - News

Deal of the week: Savings on graphics, memory, storage, and Battlefield

What's this? A deals post that isn't brimming with cheap SSDs? Do go on.

North of the border, NCIX's Go Go Gizmo Savings event has a deluge of PC hardware deals. Among my favorites are Adata's XPC 16GB DDR3-1600 ...


April 11, 2014 04:08 PM

Weekly Fedora kernel bug statistics – April 11th 2014

                            19   20  rawhide  total
Open:                      102  197      143  (442)
Opened since 2014-04-04:     3   19        7   (29)
Closed since 2014-04-04:     7   13        6   (26)
Changed since 2014-04-04:    9   32       10   (51)

Weekly Fedora kernel bug statistics – April 11th 2014 is a post from:

by davej at April 11, 2014 04:07 PM

One Thing Well



Install custom fonts on your iPhone or iPad

App Store

April 11, 2014 04:00 PM

Front Porch Republic

Hemp, Hemp, Hooray!


From the perspective of a patriotic American who’s just researched hemp’s potential from Canada to Hawaii, Germany to Colorado, things are moving from fantasy to reality so quickly that it’s kind of making me believe in a societal version of…

Read Full Article...

The post Hemp, Hemp, Hooray! appeared first on Front Porch Republic.

by Doug Fine at April 11, 2014 03:55 PM

Justin Taylor

Who Were the Women at the Empty Tomb?


This Sunday is Palm Sunday, the beginning of Holy Week.

In our book The Final Days of Jesus, Andreas Köstenberger and I try to provide some help in understanding the identity and role of Jesus's female disciples, especially with respect to their discovery of the empty tomb and their eyewitness testimony to the risen Christ.

There are a number of things about the narrative of the women that can be perplexing when we seek to harmonize their actions across the four accounts. The sheer number of Marys sometimes adds to the confusion! And even the Greek grammar can be difficult to untangle. For example, is John 19:25 about three women or four?

A. “[1] his mother and [2] his mother’s sister, [3] Mary the wife of Clopas, and [4] Mary Magdalene”


B. “[1] his mother and [2] his mother’s sister, [that is,] Mary the wife of Clopas, and [3] Mary Magdalene”

Under option A, “his mother’s sister” likely refers to Salome (which would make the sons of Zebedee—James and John—the cousins of Jesus). However, option B is more likely, meaning that Mary the wife of Clopas is Mary’s sister (or sister-in-law) and thus Jesus’s aunt.

We don’t pretend to offer definitive solutions in our book, but I thought it might be helpful for those preaching or thinking through this material to highlight the relevant entries in our reference guide at the end of the book. There is more information on these important women than we have often recognized.

1. Joanna (wife of Chuza)

Among the first women to discover the empty tomb (Luke 24:10), she was the wife of Chuza, the household manager or steward of King Herod Antipas (Luke 8:3).

She was a follower of Jesus and helped to provide financially for Jesus’s ministry, along with Susanna and many others (Luke 8:3).

2. Mary Magdalene

A Galilean woman probably from the town of Magdala (on the west bank of the Sea of Galilee). Jesus delivered her from seven demons (Luke 8:2; Mark 16:9).

She became a follower of Jesus (Matt. 27:57), a witness to the crucifixion and burial (Matt. 27:61; 28:1; Mark 15:40, 47; John 19:25), and was among the women who went to the tomb on Sunday (Mark 16:1; John 20:1).

She was the first person to see Jesus alive (Mark 16:9) and told the other disciples (Luke 24:10; John 20:18).

3. Mary (mother of Jesus, widow of Joseph of Nazareth)

She gave birth to Jesus, raised him, was present at his execution and burial, and witnessed his resurrection life.

From the cross Jesus entrusted his widowed mother to John’s care, and she went to live in his home (John 19:25-27)—perhaps because Mary’s other sons were not yet believers (John 7:5; see also Matt. 13:57; Mark 3:21, 31; 6:4).

Mary’s other sons (Matt. 13:55; Mark 6:2-3; Acts 1:14; 1 Cor. 9:4-5; Gal. 1:19) were named:

  • James (author of the biblical book of James)
  • Joseph/Joses
  • Simon
  • Judas/Jude (author of the biblical book of Jude)

She also had at least two daughters (Mark 6:3).

4. Mary (mother of James and Joses/Joseph)

A witness of Jesus’s crucifixion, burial, and resurrection appearances.

Her sons were named James the Younger (hence her husband must have been named James) and Joses/Joseph. See Matt. 27:61; 27:56; Mark 15:40, 47.

The fact that two Marys in the story have sons with the same names (James and Joseph/Joses) shows the commonality of certain names in first-century Galilee. The name Mary, in particular, was exceedingly common in first-century Palestine, hence the need to distinguish between different Marys in the Gospels, whether by way of their hometown (Mary Magdalene) or in association with their husband (Mary of Clopas) or sons (Mary mother of James and Joses).

5. Mary (wife of Clopas)

A Galilean witness of Jesus’s crucifixion, she may be identified as Jesus’s “mother’s sister” (John 19:25)—though see discussion under Salome below.

According to Hegesippus, as quoted by the historian Eusebius, Clopas was the brother of Joseph of Nazareth (Hist. Eccl. 3.11; 3.32.6; 4.22.4). If so, Mary and Clopas were Jesus’s aunt and uncle. Their son Simeon (Jesus’s cousin) became a leader of the Jerusalem church succeeding James the brother of Jesus.

6. Salome (mother of James and John)

One of Jesus’s female followers in Galilee, she witnessed the crucifixion and went to the tomb on Sunday (Mark 15:40; 16:1).

The parallel passage in Matthew 27:56 makes it likely that she is the mother of the sons of Zebedee (i.e., James and John).

by Justin Taylor at April 11, 2014 03:47 PM

Blog & Mablog

Because Conversion Turns Us

“I have not seen the same kind of willingness on the part of sacramentalists to admit that what they are telling us isn’t working, as measured by those indicators that the New Testament gives us as being inconsistent with inheriting the kingdom of God” (Against the Church, pp.75-76).

by Douglas Wilson at April 11, 2014 03:44 PM

The Thingology Blog

Come Learn PHP at ALA 2014


Summary: Tim, LibraryThing’s founder, is going to be giving a one-day, almost-free introduction to PHP programming on Friday, June 27, alongside the preconference day of ALA 2014 in Las Vegas, NV.

“Enough PHP to Be Dangerous” will cover the basics of PHP, the most common web programming language. It’s designed for people with little programming experience.(1)

Instruction will be project-based–a series of brief explanations followed by hands-on problem solving. You won’t emerge a PHP master, but you’ll know enough to be dangerous!(2)

We’ll presume some familiarity with the web, including basic HTML. You must bring your own laptop. We’ll ask you to set up a simple development environment before you come–we’ll send instructions. You should be connected to libraryland somehow. Prepare for a mental workout–there’s no point going slow when we only have a day.

Where? The session will be held Friday June 27, 9am-5pm at Embassy Suites Convention Center, three blocks from the Convention Center.

How do I sign up? Email Say who you are and put “Enough PHP to Be Dangerous” in the subject line.

We’ll close applications on Monday, April 14 at 4:00 PM EST. If more than 30 people sign up, we’ll pick the winners randomly. If fewer, we’ll allow people to sign up after the deadline on a first-come-first-served basis.

What Does it Cost? On the day of we’ll pass the hat, asking $55 to cover the $45 cost of hotel-provided muffins, coffee and sandwiches, and some of the cost of the room, equipment and wifi. If $55 is a hardship for you, no problem–we’ll waive the fee, and you’ll still get a sandwich.

Why do I need this? Libraryland needs more programmers, and more people who know what programming is. Library software vendors exert outsized power and too often produce lousy software because the community has limited alternatives. The more library programmers, the better.

Why are you doing this? Conferences are hugely expensive to exhibit at. They’re worth it, but it’s a shame not to do more. If we’re going to be out there anyway, adding a day, a room and a projector doesn’t add much to the cost, and could help the community. Also, I’m a frustrated former Latin teacher, so it’ll be fun for me!(3)

Is this officially connected to ALA, LITA, Library Code Year, etc.? Nope. We’re doing this on our own. It’s easier that way. Of course, we love all these groups, especially our friends at LITA.(4)

Will the class be broadcast? No. That sounds fiddly. Maybe another time.

Want to help out? If you’re a programmer and want to help make this happen, email me. It would be great to have another programmer or two helping people figure out why their script won’t run. It’ll be fun, and you can put it on your resume.

1. If you tried to learn something years ago, or do a little cutting and pasting of JavaScript, fine. If you’re a master of another programming language, you’ll be bored.
2. We’ll focus on the most basic skills–variables, loops, functions, etc. We’ll focus on non-OO PHP. We’ll print up some funny diplomas, so you can show off your new-found dangerousness back at the library.
3. Alas, the hotel doesn’t provide chalk boards.
4. We take inspiration from Introductory Python Workshop at ALA 2013, put together by Andromeda Yelton and others.

by Tim at April 11, 2014 02:45 PM


"Seeking Allah, Finding Jesus": The Book I Wish I Had 8 Years Ago


For a few months I worked for an upscale department store in D.C. before beginning my M.Div. program. It was a memorable work experience because it was the first time I had encountered Muslims.

There was the woman from Morocco. Ahmad, a half-Pakistani and half-Japanese young man more agnostic than devoted. And Olam, a second-generation twentysomething from Saudi Arabia whom I engaged in a handful of spiritual conversations over lunch.

I thoroughly enjoyed these friendships and interactions, yet I felt ill-prepared when it came to issues of faith. I didn’t understand Islam itself, let alone the Muslim experience, and there were few resources to equip me to enter into that experience and help them find Jesus.

Nabeel Qureshi has solved this problem by writing the resource I needed. In Seeking Allah, Finding Jesus, Qureshi uses his own dramatic journey from Islam to Christianity to equip us and our people to engage Muslims, walk with them through their spiritual journey, help them encounter Christianity, and find Jesus along the way.

This is the book I wish I had 8 years ago, because this book’s power lies in Qureshi’s own story. It transforms the grey-scale interfaith dialogue conversation into a full-color high-definition experience. To help us he uses his own story to share three vital elements we need to understand to help Muslims find Jesus:

Qureshi's Story Shows Us What It's Like to Be Muslim

First, Qureshi tears down walls by giving us non-Muslim readers an insider’s perspective into a Muslim’s heart and mind. He helps us non-Muslims understand what it’s like to be Muslim, which is where we must begin.

In one illuminating chapter Qureshi explains how Muslims perceive people in the West, which affects our ability to impact them for Christ.

“[T]he average Muslim immigrant expects people in the West to be promiscuous Christians and enemies of Islam…When they come to America, their cultural differences and perceptions often cause them to remain isolated from Westerners.” (80)

He goes on to share that because of the many barriers for Muslim immigrants, “Only the exceptional blend of love, humility, hospitality can overcome [those barriers], and not enough people make the effort.” (80)

Qureshi's Story Teaches Us About Islam and the Quest for Christianity

Second, he equips us with the facts and knowledge about Islam in contrast with the strength of the gospel. While helpful callouts define key cultural and religious Islamic terms, the real power lies in how his story explores Islam and Christian beliefs.

One memorable story recounts a meal shared by his father ("Abba"), his college friend David, and New Testament scholars Mike Licona and Gary Habermas. In the story we learn that Muslims don’t believe Jesus actually died by crucifixion.

At one point Abba says, “it’s not possible that Jesus died on the cross. He was beloved by God, and he cried out to be saved. If there are verses that say he prophesied his death on the cross, those verses must have been added by Christians.” (152)

Both Licona and Habermas respond gently but deliberately, explaining the scholarly unanimity regarding the historicity of Jesus’ death on the cross. This put Qureshi in an intellectual bind: “It seemed to me that if I wanted to hold onto an Islamic version of Jesus’ crucifixion…I would have to discard history.” (153)

Such a quandary forced Qureshi to “start considering it a remote possibility that the Christian message could possibly be true.” (154)

Qureshi's Story Reveals How Muslims Find Jesus

Finally, we experience the immense struggle Muslims have while grappling with the gospel and Jesus himself. This is a somewhat agonizing thread woven throughout the book, as it shows just how difficult it is to leave behind a faith that is deeply enmeshed in their lives. And there are costs to unweaving that mesh:

Following Jesus meant that I would immediately be ostracized from my community. For all Muslims, it means sacrificing friendships and social connections that they have built from childhood. It could mean being rejected by one’s parents, siblings, spouse, and children. (251)

We find out just how great a cost Qureshi himself would bear. And yet God was faithful. He provided community to support his journey, insight into his truth, and supernatural dreams to call him to accept the gospel.


"There is a simple reason I never listened to street preachers," Qureshi writes, "they didn't seem to care about me." Unfortunately many Christians approach evangelism the same way. Yet given the life change the gospel requires, "evangelism requires relationships." (121) 

This is especially true of Muslims, as I discovered in D.C. If you have a Muslim co-worker or neighbor, dive deep into their life. Qureshi and his story will help you, and, in turn, help them find Jesus through you.


Jeremy Bouma (Th.M.) is a pastor with the Evangelical Covenant Church in West Michigan. He is the founder of THEOKLESIA, a content curator dedicated to helping the 21st century church rediscover the historic Christian faith; holds a Master of Theology in historical theology; and writes about faith and life at

by Jeremy Bouma at April 11, 2014 02:44 PM


Are We Losing a Generation? (CaPC Podcast with @dandarling)

This week at the Together for the Gospel conference I had a chance to hang out with my editor Richard Clark and connect with Dan Darling of the Ethics and Religious Liberty Commission to do a little podcast for Christ and Pop Culture. We chatted about evangelism, the new cultural situation we find ourselves in, and whether or not we’re “losing a generation.” It was a good time.

You can go listen to it here at the Christ and Pop Culture site.

You can also go check out Dan Darling’s CNN article on the same subject here.

Soli Deo Gloria

by Derek Rishmawy at April 11, 2014 02:35 PM

The Tech Report - News

Windows 8.1 Update failing for many users

Microsoft released a big Windows 8.1 update earlier this week. Dubbed simply Windows 8.1 Update, the patch contains a myriad of little tweaks designed to improve the UI for desktop users. It sounds like a prudent update to the operating system, but I can't say for sure, because the installer keeps failing on me—and I'm not the only one. As InfoWorld reports, numerous users are having problems applying Windows 8.1 Update. Microsoft's support forum is loaded with posts complaining about issues, and so is Twitter.

Folks seem to be getting hit with a few different error codes. One report even claims the update failed after a clean OS reinstall. ...


April 11, 2014 02:26 PM

Justin Taylor

Logic on Fire: The Life & Legacy of Martyn Lloyd-Jones

A forthcoming documentary:

If you preorder a digital download of this film by midnight (Central Time) tonight (Friday, April 11, 2014), you will be entered to win Lloyd-Jones’s complete 14-volume commentary set on Romans (which retails for $369).

Here is some more information on the project from Jonathan Catherwood, president of the MLJ Trust and grandson of the Doctor.

Dear Friend,

We wanted to let you know about some exciting news. A Christian filmmaker called Matthew Robinson is making a documentary on Martyn Lloyd-Jones entitled “Logic on Fire: the Life and Legacy of Dr. Martyn Lloyd-Jones.” The website for the film is News of the film was announced at this week’s “Together for the Gospel” gathering in Kentucky, right before a panel on the influence of Martyn Lloyd-Jones on the evangelical community.

Matt and his team have already started filming interviews with Christian leaders such as Iain Murray (who wrote the authorized two-volume biography of Dr. Lloyd-Jones [see here]), and will be in Britain later this month interviewing members of the Lloyd-Jones family and visiting key locations in the life of Dr. Lloyd-Jones in England and Wales. The film is due for release in 2015.

We also wanted to let you know that while we are very excited about this project, and have greatly appreciated the care that Matt has taken to make sure that Dr. Lloyd-Jones’s descendants and the Trust are comfortable with his approach, this is an independent venture. The MLJ Trust is not funding the film and will not be receiving any proceeds from it. Our mission is to make the 1,600 audio sermons of Dr. Lloyd-Jones available at for anyone who wants them at no cost, and that is where all our efforts and resources are focused. Our hope is that interest in the film will lead to interest in the Gospel message contained in all of Dr. Lloyd-Jones’s sermons.

Every blessing to you,

Jonathan Catherwood
President; MLJ Trust

by Justin Taylor at April 11, 2014 02:25 PM

The Ontological Geek | The Ontological Geek

Ladies’ Man: Womanizing in the Witcher

I wasn’t far into the story when it became clear that The Witcher was going to be a guilty pleasure. This realization came when I had my first romantic encounter.

by Alex Duncan at April 11, 2014 02:24 PM

Zippy Catholic

End Game

We are constantly being assured that Game teaches men things that nobody else teaches, so that men who want to learn these things specifically have nowhere else to go other than pickup artists.  That means that what Game teaches must have a specific difference from what it has been possible to learn elsewhere in the decades before the “Game renaissance” on the web. And Game must be something empowering: even if, according to its best practitioners, it only works as well as a placebo, men would still see results from adopting it.

So what actually is the specific difference between social competence in general and Game more specifically?  What empowering techniques can you not learn from any sources other than pickup artists and sluts?

The specific things you won’t learn from sources other than pickup artists and sluts are the things specific to pickup artists and sluts: unchaste behaviors toward the opposite sex.

by Zippy at April 11, 2014 02:18 PM

One Thing Well

Chameleon SSD Optimizer

Chameleon SSD Optimizer:

Chameleon is an optimization tool for solid-state drives on Mac OS X. It can enable TRIM on non-Apple-branded disks. It can also increase durability by reducing I/O write cycles, set the hibernate mode, and save space by disabling the sleep image.

April 11, 2014 02:00 PM


eBook Sale: Study Resources & NIV Bibles, Ends 4/13/14


We'd like to share the Study Resources and Bibles eBook sale with you for two reasons:

  1. These resources will help you and your people dig deeper into the Bible.
  2. The sale ends very soon: Sunday, April 13 (2014).

Visit the sale now.



Some Highlights from the eBook Sale


Story of God Bible Commentary: Sermon on the Mount

By Scot McKnight

"The Sermon on the Mount is the moral portrait of Jesus’ own people," writes Scot McKnight. This commentary will show you why. The sale includes another volume from this new commentary series: Philippians by Lynn H. Cohick. Go to sale



Grasping God's Word: A Hands-on Approach to Reading, Interpreting, and Applying the Bible (3rd ed.)

By J. Scott Duvall and J. Daniel Hays

A solid text that will help beginning seminary students, college students, and other serious readers get a grip on God's Word.
Go to sale



Zondervan Illustrated Bible Dictionary

By Michael Bird

Over 7200 entries plus photographs, charts, illustrations, maps and more. Incidentally, Olive Tree's download of this book for their free Bible study app is a perennial bestseller. Go to eBook sale



How to Read the Bible for All Its Worth (3rd ed.)

By Gordon Fee and Douglas Stuart

Used worldwide, this is the classic book for students and thoughtful readers, who will discover tools for understanding the meaning of Scripture and applying it to their lives in the 21st century. 
Go to sale



NIV Leadership Bible: Leading by the Book

Edited by Sidney Buzzell, William Perkins, Kenneth D. Boa

Want to grow your leadership skills? This new Bible includes 52 weeks of biblically based lessons, helping you develop biblical values (humility, courage) and skills (systems thinking, conflict management). It was newly converted to eBook format.
Go to sale


You’ll find more in this eBook sale: bestselling study Bibles, commentaries, Bible dictionaries and encyclopedias, and lay-friendly resources for learning hermeneutics.

Get the deals today.


by Zondervan at April 11, 2014 01:33 PM


The Apple Doesn’t Fall Far From The Tree


No apples were harmed in the making of this comic.

Also, big thanks to Robin for fixing our server issues, again!

by DOGHOUSE DIARIES at April 11, 2014 01:32 PM

Crossway Blog

Making Disciples Like a Soldier, Athlete, Farmer . . . Mom


This is a guest post by Gloria Furman. Her newest book is Treasuring Christ When Your Hands Are Full: Gospel Meditations for Busy Moms.

Vomit and Verses

“Why do they always come to my side of the bed when they need to throw up?” I asked my husband an honest question.

One of our kids was sick that day, so it was time to put on the Nurse Mommy hat (and poncho). I changed cold compresses on her head, rinsed out her vomit bucket, held a drinking straw to her lips, kept the healthy kids under control, and made a mental note to call my own mom and thank her for all the times she did this for me.

Somewhere in the middle of laundering soiled bed sheets, I read that day’s Scripture reading: 2 Timothy 2. In this chapter, Paul tells Timothy that he must rely on God’s grace to fulfill his calling as a disciple-maker. And here I was up to my elbows with a very tangible illustration of my own need for grace to do what God has called me to do.

The daily (and nightly) disciple-making work of mothering makes us increasingly aware of our need to be “strengthened by the grace that is in Christ Jesus” (2 Timothy 2:1).

Disciple-Making Mothers

Even though the metaphors Paul uses in this particular passage aren’t your typical descriptions of motherhood, it’s not hard to catch a glimpse of disciple-making motherhood through the eyes of a soldier, athlete, and farmer (2 Timothy 2:1-7).

A mother participates in making disciples of Jesus as she invests her life in the work of evangelizing and discipling her children in everyday life. It has been said that moms lead more people to Jesus than do evangelistic events and outreach programs. Consider the examples that Paul uses to encourage Timothy in his calling and see how they relate to motherhood. Like faithful soldiers, we are diligent even with the small things because we aim to please the Lord. Like persistent athletes who compete with integrity, we aim to stay focused on what the Lord has given us to do. Like hard-working farmers, we invest our everyday lives into our children and pray expectantly that God would produce fruit that will last.

You may have read on a greeting card somewhere that motherhood is not for the faint of heart. But I don’t buy it; motherhood must be for the faint of heart. Disciple-making like a soldier/athlete/farmer-mom means that we need to be strengthened by God’s grace to do the routine, hard work that moves the gospel forward.

Mary Had a Little Lamb

Bringing order out of kitchen chaos and subduing the algebra homework with your kids sounds like a demeaning job description to the world. But the disciple-making work of motherhood is part of something bigger than simply keeping the dust bunnies at bay.

We nurture life in light of the long-view of motherhood. When we look out and see the effects of the Fall “as sin reigned in death,” we don’t despair. We look to the cross and remember that, because of Jesus’ substitutionary death, “grace also might reign through righteousness leading to eternal life through Jesus Christ our Lord” (Romans 5:21).

We nurture life in the face of death to the praise of God’s glory in all of our work. Being “pro” life means you are for that person’s life (especially their spiritual life) in every way and at all times. And there’s no way a busy mom can love like that unless she sees how she has been loved like that.

We are loved by God in ways that our sin-besotted hearts cannot comprehend. Even after our first parents sinned in the Garden and God justly pronounced a curse, a blessing for mankind could still be heard. God promised a Rescuer. “I will put enmity between you and the woman, and between your offspring and her offspring; he shall bruise your head, and you shall bruise his heel” (Genesis 3:15).

Thousands of years later, a virgin gave birth to a son. His tiny little feet would ultimately turn into serpent-crushing feet. God sent his Son, Jesus, to do the work of subduing his enemy and rescuing his children by dying on the cross and rising from the dead. The long-view of motherhood looks way past potty training and high school graduation to scan the horizon of eternity, where the incarnate Son of God is risen and reigning.

Making Disciples in the Living Room

Our children are included in panta ta ethne (“all the nations”) of the Great Commission. Jesus’s assurances that he has been given “all authority in heaven and on earth” (Matthew 28:18) and that he is with us always “to the end of the age” (Matthew 28:20) are ours for the keeping and believing in our everyday disciple-making.

The world isn’t going to give you a medal for wiping vomit off the floor. Paparazzi aren’t waiting in the bushes outside your house trying to snap a photo of you praying for your kids. But your prayer-full, hope-filled work of evangelism and discipleship—done through the strengthening grace of Jesus—gives him praise that echoes in eternity.

And that moves all of heaven to rejoice.

Gloria Furman is a wife, mother of four young children, doula, and blogger. In 2008 her family moved to the Middle East to plant Redeemer Church of Dubai where her husband, Dave, serves as the pastor. She is the author of Glimpses of Grace and Treasuring Christ When Your Hands Are Full, and she blogs regularly at Desiring God, The Gospel Coalition, and

by Crossway Author at April 11, 2014 01:30 PM

John C. Wright's Journal

Night Land Day APPROACHES!

Here is the link: 

Monday, the Feast of Saint Bezeret, is the official day for AWAKE IN THE NIGHT LAND to go on sale, but for you, my cherished readers, here is preemptive early roll out.

To my fan: use the link! Be the first to write a flattering review, before the hate-filled Lefty trolls roll in and smear the page, as they did on my Goodreads page. Amazon never removes any material, no matter how vile, false and rude.

AWAKE IN THE NIGHT LAND is an epic collection of four of John C. Wright’s brilliant forays into the dark fantasy world of William Hope Hodgson’s 1912 novel, The Night Land. Part novel, part anthology, the book consists of four related novellas, “Awake in the Night”, “The Cry of the Night-Hound”, “Silence of the Night”, and “The Last of All Suns”, which collectively tell the haunting tale of the Last Redoubt of Man and the end of the human race. Widely considered to be the finest tribute to Hodgson ever written, the first novella, “Awake in the Night”, was previously published in 2004 in The Year’s Best Science Fiction: Twenty-First Annual Collection. AWAKE IN THE NIGHT LAND marks the first time all four novellas have been gathered into a single volume.


John C. Wright has been described by reviewers as one of the most important and audacious authors in science fiction today. In a recent poll of more than 1,000 science fiction readers, he was chosen as the sixth-greatest living science fiction writer.


by John C Wright at April 11, 2014 01:25 PM

Zippy Catholic

Nominalism and the avoidance of specific difference

Nominalism is all about avoiding the implications of the essences of things, and one of the easiest ways to avoid the implications of essences is to obfuscate specific differences.

Take adultery, for example.  Adultery has an essence, and the specific difference between sex acts generally and adultery in particular is that adultery takes place between partners at least one of whom is married to someone else.

Or take a contracepted sex act.  The specific difference between sex acts generally and contracepted sex acts in particular is that the latter have been modified in some way which blocks natural fertility.

I myself have concluded, after a couple of years of experience with the subject, that the specific difference between Game/sluttiness and social competence more generally is inchastity.

What nominalists do in order to avoid judgment of the things they support is obfuscate specific differences.  That is why we are constantly being cajoled into assent to trite slogans (“social competence is good!”; “psychological knowledge can be used for good or evil!” etc.) and told that that is Game.

by Zippy at April 11, 2014 01:24 PM

Proper Fixation

Working simultaneously vs waiting simultaneously

"Multiprocessing", "multi-threading", "parallelism", "concurrency" etc. etc. can give you two kinds of benefits:

  • Doing many things at once - 1000 multiplications every cycle.
  • Waiting for many things at once - wait for 1000 HTTP requests just issued.

Some systems help with one of these but not the other, so you want to know which one - and if it's the one you need.

For instance, CPython has the infamous GIL - global interpreter lock. To what extent does the GIL render CPython "useless on multiple cores"?

  • Indeed you can hardly do many things at once - not in a single pure Python process. One thread doing something takes the GIL and the other thread waits.
  • You can however wait for many things at once just fine - for example, using the multiprocessing module, or you could spawn your own thread pool to do the same. Many Python threads can concurrently issue system calls that wait for data - reading from TCP sockets, etc. Then instead of 1000 request-wait, request-wait steps, you issue 1000 requests and wait for them all simultaneously. Could be close to a 1000x speed-up for long waits (with a 1000-thread worker pool; more on that below). Works like a charm.
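
To make the "simultaneous waiting" point concrete, here's a minimal sketch using Python's standard thread pool. The network wait is simulated with `time.sleep`, which releases the GIL just as a blocking socket read would; the 100-worker pool and the 50 ms delay are illustrative numbers, not taken from the post:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(i):
    # Stand-in for a blocking I/O call (e.g. an HTTP request).
    # sleep() releases the GIL, so other threads can wait concurrently.
    time.sleep(0.05)
    return i * 2

start = time.time()
with ThreadPoolExecutor(max_workers=100) as pool:
    # 100 "requests" issued and waited for simultaneously...
    results = list(pool.map(fake_request, range(100)))
elapsed = time.time() - start

# ...so wall time is close to one 50 ms wait, not 100 of them (~5 s).
assert elapsed < 1.0
```

Run sequentially, the same loop would take about five seconds; with the pool it finishes in roughly the time of a single wait, GIL notwithstanding.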

So GIL is not a problem for "simultaneous waiting" for I/O. Is GIL a problem for simultaneous processing? If you ask me - no, because:

  • If you want performance, it's kinda funny to use pure Python and then mourn the fact that you can't run, on 8 cores, Python code that's 30-50x slower than C to begin with.
  • On the other hand, if you use C bindings, then the C code could use multiple threads actually running on multiple cores just fine; numpy does it if properly configured, for instance. Numpy also uses SIMD/vector instructions (SSE etc.) - another kind of "doing many things at once" that pure Python can't do regardless of the GIL.

So IMO Python doesn't have as bad a story in this department as it's reputed to have - and if it does look bad to you, you probably can't tolerate Python's slowness doing one thing at a time in the first place.

So Python - or C, for that matter - is OK for simultaneous waiting, but is it great? Probably not as great as Go or Erlang - which let you wait in parallel for millions of things. How do they do it? Cheap context management.

Context management is a big challenge of waiting for many things at once. If you wait for a million things, you need a million sets of variables keeping track of what exactly you're waiting for (has the header arrived? then I'm waiting for the query. has it arrived? then I ask the database and wait for it etc. etc.)

If those variables are thread-local variables in a million threads, then you run into one of the problems with C - and hence OS-supported threads designed to run C. The problem is that C has no idea how much stack it's gonna need (because of the halting problem, so you can't blame C); and C has no mechanism to detect that it ran out of stack space at runtime and allocate some more (because that's how its ABIs have evolved; in theory C could do this, but it doesn't.)

So the best thing a Unixy OS could do is, give C one page for the stack (say 4K), and make say the next 1-2M of the virtual address space inaccessible (with 64b pointers, address space is cheap). When C page-faults upon stack overflow, give it more physical memory - say another 4K. This method means at least 4K of allocated physical memory per thread, or 4G for a million threads - rather wasteful. (I think in practice it's usually way worse.) All regardless of us often needing a fraction of that memory for the actual state.

And that's before we got to the cost of context switching - which can be made smaller if we use setjmp/longjmp-based coroutines or something similar, but that wouldn't help much with stack space. C's lax approach to stack management - which is the way it is to shave a few cycles off the function call cost - can thus make C terribly inefficient in terms of memory footprint (speed vs space is generally a common trade-off - it's just a bad one in the specific use case of "massive waiting" in C).

So Go/Erlang don't rely on the C-ish OS threads but roll their own - based on their stack management, which doesn't require a contiguous block of addresses. And AFAIK you really can't get readable and efficient "massive waiting" code in any other way - your alternatives, apart from the readable but inefficient threads, are:

  • Manual state machine management - yuck
  • Layered state machines as in Twisted - better, but you still have callbacks looking at state variables
  • Continuation passing as in Node.js - perhaps nicer still, but still far from the smoothness of threads/processes/coroutines
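
To illustrate the difference in smoothness, here's a hypothetical two-step protocol handler (read a header, then a body) written both ways in Python: as a manual state machine with an explicit state variable, and as a generator-based coroutine where plain locals and top-to-bottom control flow replace the state field. Both versions are illustrative sketches, not code from the post:

```python
class ManualHandler:
    """The 'yuck' option: an explicit state variable, with callbacks
    inspecting it on every event."""
    def __init__(self):
        self.state = "WAIT_HEADER"
        self.header = None

    def on_data(self, chunk):
        if self.state == "WAIT_HEADER":
            self.header = chunk
            self.state = "WAIT_BODY"
            return None
        elif self.state == "WAIT_BODY":
            self.state = "DONE"
            return (self.header, chunk)

def coroutine_handler():
    # The same flow as straight-line code: each 'yield' marks a wait,
    # and locals carry the context the state machine kept in fields.
    header = yield
    body = yield
    return (header, body)
```

With two steps the state machine is tolerable; with a dozen waits and loops in the protocol, the coroutine stays readable while the `if`/`elif` ladder does not.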

The old Node.js slides say that "green threads/coroutines can improve the situation dramatically, but there is still machinery involved". I'm not sure how that machinery - the machinery in Go or Erlang - is any worse than the machinery involved in continuation passing and event loops (unless the argument is about compatibility more than efficiency - in which case machinery seems to me a surprising choice of words.)

Millions of cheap threads or whatever you call them are exciting if you wait for many events. Are they exciting if you do many things at once? No; C threads are just fine - and C is faster to begin with. You likely don't want to use threads directly - it's ugly - but you can multiplex tasks onto threads easily enough.

A "task" doesn't need to have its own context - it's just a function bound to its arguments. When a worker thread is out of work, it grabs the task out of a queue and runs it to completion. Because the machine works - rather than waits - you don't have the problems with stack management created by waiting. You only wait when there's no more work, but never in the middle of things.
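
The task/worker-pool idea above can be sketched in a few lines of Python (the worker count, task count, and squaring task are made-up illustration): each worker runs tasks to completion and only blocks between tasks, never in the middle of one.

```python
import queue
import threading

def worker(tasks, results):
    # Grab a task - a function bound to its argument - and run it to
    # completion. No per-task stack survives across waits, because a
    # worker never waits mid-task.
    while True:
        task = tasks.get()
        if task is None:  # sentinel: no more work
            return
        fn, arg = task
        results.put(fn(arg))

tasks, results = queue.Queue(), queue.Queue()
NUM_WORKERS = 4  # roughly a thread per core - not a thread per task

threads = [threading.Thread(target=worker, args=(tasks, results))
           for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

NUM_TASKS = 1000
for i in range(NUM_TASKS):
    tasks.put((lambda x: x * x, i))
for _ in threads:          # one sentinel per worker
    tasks.put(None)
for t in threads:
    t.join()

total = sum(results.get() for _ in range(NUM_TASKS))
```

A million tasks would still need only those four threads - the per-task context is just a function and its argument, not a stack.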

So a thread pool running millions of tasks doesn't need a million threads. It can be a thread per core, maybe more if you have some waiting - say, if you wait for stuff offloaded to a GPU/DSP.

I really don't understand how Joe Armstrong could say Erlang is faster than C on multiple cores, or things to that effect, with examples involving image processing - instead of event handling which is where Erlang can be said to be more efficient.

Finally, a hardware-level example - which kind of hardware is good at simultaneous working, and which is good at simultaneous waiting?

If your goal is parallelizing work, eventually you'll deteriorate to SIMD. SIMD is great because there's just one "manager" - instruction sequencer - for many "workers" - ALUs. CPUs, DSPs and GPUs all have SIMD. NVIDIA calls its ALUs "cores" and 16-32 ALUs running the same instruction "threads", but that's just shameless marketing. A "thread" implies, to everyone but a marketeer, independent control flow, while GPU "threads" march in lockstep.

In practice, SIMD is hard despite thousands of man-years having been invested into better languages and libraries - because telling a bunch of dumb soldiers marching in lockstep what to do is just harder than running a bunch of self-motivating threads each doing its own thing.

(Harder in one way, easier in another: marching in lockstep precludes races - non-deterministic, once-in-a-blue-moon, scary races. But races of the kind arising between worker threads can be almost completely remedied with tools. Managing the dumb ALUs cannot be made easier with tools and libraries to the same extent - not even close. Where I work, roughly an entire team is responsible for SIMD programming, while threading is mostly automatic and bugs are weeded out by automated testing.)

If, however, you expect to be waiting much of the time - for memory or for high-latency floating point operations, for instance - then hordes of hardware threads lacking their own ALUs, as in barrel threading or hyper-threading, can be a great idea, while SIMD might do nothing for you. Similarly, a bunch of weaker cores can be better than a smaller number of stronger cores. The point being, what you really need here is a cheap way to keep context and switch between contexts, while actually doing a lot at once is unlikely to be possible in the first place.


  • Doing little or nothing while waiting for many things is both surprisingly useful and surprisingly hard (it took me way too long to internalize this, both in my hardware-related and my software/server-related work). It motivates things that look rather strange, such as "green threads" and hardware threads without their own ALUs.
  • Actually doing many things in parallel - to me the more "obviously useful" thing - is difficult in an entirely different way. It tends to drag in ugly languages, intrinsics, libraries etc. about as much as having to do one single thing quickly. The "parallelism" part is actually the simplest (few threads, so easy context management; races either non-existent [SIMD] or very easy to weed out [worker pool running tasks]).
  • People doing servers (which wait a lot) and people doing number-crunching (work) think very differently about these things. Transplanting experience/advice from one area to the other can lead to nonsensical conclusions.

See also

Parallelism and concurrency need different tools - expands on the reasons for races being easy to find in computational code - but impossible to even uniformly define for most event handling code.

by Yossi Kreinin at April 11, 2014 01:24 PM


Friday Oddness: Tom Lehrer

Here's something for your Friday: an in-depth article about the work and legacy of Tom Lehrer (surprisingly in-depth and interesting for a BuzzFeed article).

When Lehrer entered graduate school in 1946 — at 18 — he found himself at the center of a group of friends who called themselves the “Graduate Gang.” They amused themselves with the quizzes, crossword puzzles, and math games they brought to their dinners in the Harvard dining hall. It was, in retrospect, a gilded circle: One member, Philip Warren Anderson, went on to win a Nobel Prize in physics; Lewis Branscomb served as the chief scientist of IBM; and Robinson was an executive director of the Carnegie Corporation.

“Tom was the intellectual leader in the sense that he was the funniest and he would come up with cuter problems,” Robinson said, adding that when Anderson wrote his 50th Anniversary Report for the Harvard class of ‘94, he’d recall: “When I was a student with Tom Lehrer…”
Within the year, though, his graduate group of friends had begun to trickle out of Cambridge, first the chemist, then the physicists, then the historians who took much longer to get a Ph.D. As a souvenir for them, Lehrer decided in 1953 to make a record of the songs he had written at Harvard. He recorded Songs by Tom Lehrer in one session at Trans Radio studios in Boston on a 10-inch LP. He wrote the liner notes himself, called upon the wife of Robinson’s boss to do the illustrations, and had the covers printed at Shea Brothers printers near Harvard Square, just up the street from where he and Robinson shared a room on the third floor of a house.
By 1954 — when he was trying to avoid the draft by working for a defense contractor — he had sold 10,000 records. He had also quickly dissolved Lehrer Music, of which he was president, in December for “various reasons,” among them: “Certain stockholders objected to the president’s face.” He gave up and shipped off to Fort Meade in 1955, an early officer in the National Security Agency. (He is believed, during that time, to have invented vodka Jell-O shots.) By the end of the decade, he had sold 370,000 records.
Yet despite his enormous success, global popularity, and the release of his second album, More Songs by Tom Lehrer that year, it was exactly at this time that Lehrer first told Robinson he wanted to stop performing. Lehrer has told friends and various interviewers that he didn’t enjoy “anonymous affection.” And while his work was widely enjoyed at the time, it was also something of a scandal — the clever songs about math and language were for everyone, but Lehrer’s clear-eyed contemplation of nuclear apocalypse was straightforwardly disturbing....

People would always ask him: “What do you want to do as a career?” Robinson said.
“What’s wrong with graduate school as a career?!” Lehrer would respond. He spent some 15 years working on and off on his dissertation, until he finally gave it up in 1965.
The space for one of the animating forces in Lehrer’s music, his liberal politics, was shrinking too. Lehrer was a hero of the anti-nuclear, civil rights left; he occupied the bleeding edge of the elite liberalism of the day. “I Wanna Go Back to Dixie” minces no words in its scorn for the industry of American nostalgia, and particularly for the American South: “I wanna talk with Southern gentlemen / And put my white sheet on again / I ain’t seen one good lynchin’ in years … The land of the boll weevil / Where the laws are medieval / Is callin’ me to come and nevermore roam.”
But his left was the square, suit-wearing, high-culture left. His circle at Harvard included Arthur Schlesinger Jr., the renowned historian, JFK biographer, and then-nominal chairman of the Cambridge chapter of Americans for Democratic Action. His political hero was Adlai Stevenson, the Democratic Party’s presidential candidate in 1952 and 1956, the man whom Richard Nixon damagingly dismissed as an “egghead.”
Stevenson’s losing battle marked the end of a political tradition, and also the beginning of the end of a kind of Ivy League liberal intellectualism’s place atop the Democratic Party. What was coming was the New Left and the counterculture, something whose aesthetics Lehrer couldn’t stand, even if their politics weren’t necessarily at odds.
“It takes a certain amount of courage to get up in a coffeehouse or a college auditorium and come out in favor of the things that everybody else in the audience is against, like peace and justice and brotherhood and so on,” he deadpans in his introduction to the whiny “Folk Song Army” on That Was the Year That Was. “We are the folk song army / Everyone of us cares / We all hate poverty, war, and injustice / Unlike the rest of you squares.”
The New Left agreed with Lehrer on Vietnam. His last public performance, in fact, was on a fundraising tour for George McGovern in 1972. But the singer — who saw himself as “a liberal, one of the last” — felt less at home in the new Democratic Party. In the end, Stevenson’s party, and Lehrer’s, lost — and with it, at least to Lehrer’s mind, a prevailing sense of humor. “Things I once thought were funny are scary now,” he told People magazine in 1982. “I often feel like a resident of Pompeii who has been asked for some humorous comments on lava.”
”The liberal consensus, which was the audience for this in my day, has splintered and fragmented in such a way that it’s hard to find an issue that would be comparable to, say, lynching,” he also told the New York Times in Purdum’s 2000 article, which was part of his last round of interviews to promote an anthology of his work. ”Everybody knows that lynching is bad. But affirmative action vs. quotas, feminism vs. pornography, Israel vs. the Arabs? I don’t know which side I’m on anymore. And you can’t write a funny song that uses, ‘On the other hand.”’

When I was young, my family had a copy of "An Evening Wasted with Tom Lehrer", and I used to listen to it often, belting out Poisoning Pigeons in the Park and The Masochism Tango in childish tones. I'd never particularly noticed Lehrer's politics, though as a kid and listening forty years after his heyday, that's hardly surprising.

by Darwin at April 11, 2014 01:12 PM

Orthodoxy and Heterodoxy

Did Jesus Have a Wife? What Recent Analysis of “The Gospel of Jesus’ Wife” Really Means

After a seemingly interminable delay, the results are finally in! According to many news outlets, the Coptic fragment named “The Gospel of Jesus’ Wife” (GJW) has been shown to be a genuine text. Or has it? Of course the devil is in the details, and there are many of them that nuance the purported authentication … Continue reading

by Eric Jobe at April 11, 2014 01:08 PM

CrossFit Naptown

HERO WOD: “Daniel”



For time:
50 pull-ups
400-meter run
95-lb. thruster, 21 reps
800-meter run
95-lb. thruster, 21 reps
400-meter run
50 pull-ups

Note: Please be prepared to scale pullups in this workout.



With heavy hearts we dedicate this workout to Army Sgt 1st Class Daniel Crabtree who was killed in Al Kut, Iraq on Thursday June 8th. To Daniel’s family and friends, we express our sorrow; to his wife Kathy and daughter Mallory, we tearfully acknowledge your loss as the true cost of freedom. Fair Winds, Daniel.

Come hang out at the gym Saturday at 1:30pm-4pm.

If you have outdoor games bring them. We will be celebrating the completion of the open and just having a good ole CrossFit Family Party. Bring friends and family and feel free to bring a snack or drink to the party. NO need to RSVP, just show up.

by Coach Jared at April 11, 2014 12:50 PM

The Finance Buff

Save Money Or Make Money

Starting next week, my weekly digest like this one will move to e-mail delivery only. I’m making this change because I want to populate the blog website with only full-featured articles. Regular articles will still be posted to the blog and included in the RSS feed.

Current e-mail subscribers will continue receiving both new blog articles and the digest. No action is required on your part.

If you are reading the blog articles in some other ways, I encourage you to switch over to e-mail subscription so you won’t miss the digest. If you prefer you can also sign up for just the digest.

Because I don’t post often, the digest posts make up 1/3 to 1/2 of the posts on the blog. Although readers like them, search engines only see them as a bunch of links with a short blurb and regard them as “low quality.” Too many “low quality” pages on a site associated with my name negatively affect my standing. A lower standing means fewer people will be able to find even my good stuff.

Putting the digest through a different channel outside search engines’ purview is my compromise win-win. Readers still get the same content. I get to correct the search engines’ misperception of my site and me as the author.

Besides setting up the new e-mail service, I read these interesting articles this week:

The Great Debate: Save Money or Make Money? by Kali Hawlk at Your Personal Finance Pro

While most people will say both, if I must choose I’d choose make money, but not as in moonlighting or freelancing. Doing a job well will usually work much better. Read my story in About Me.


Reader Study: Getting Rich in Manhattan… on $65k/year by MMM at Mr. Money Mustache

This article is actually a math quiz. Can you figure out what’s going on?


Social Security And Early Retirement at Go Curry Cracker

Getting less from Social Security is not a problem for someone retiring early and thus having worked fewer years. A lower average earning over 35 years means you get a better return for your Social Security taxes paid.


Evaluating Exposure To The Alternative Minimum Tax And Strategies To Reduce The AMT Bite by Michael Kitces at Nerd’s Eye View

By coincidence Michael Kitces and I posted about the AMT on the same day. Of course he goes into much more details, including visuals for the phaseout zones.


Tap Retirement Funds/Social Security at Wall Street Journal

Recorded webcast featuring blogger Mike Piper at Oblivious Investor and other guests. It goes with this article The Smart Way to Tap Investment Accounts in Retirement by Andrea Coombes.


The value of enduring advice by Fran Kinniry at Vanguard Blog for Advisors

Vanguard shows where a financial advisor can offer value to a client. If you do those things yourself, you can get most of the value too. For people who would rather outsource, it would be smart to use low-cost good-quality advice.


XY Planning Network and the Future of Getting Paid For Financial Planning by Michael Kitces at Nerd’s Eye View

The most value in having a financial advisor is the financial advice part, but most people who have an advisor are paying much more for the much less valuable asset management part (administrative chores). I’m glad to see this new venture co-founded by Michael Kitces focusing on the advice part. I wish it every success.


Does Asset Location Make Sense? by Rick Ferri at

If you’ve been reading my blog for some time, you know where I stand already. At low interest rates, it can make more sense to use taxable accounts for [muni] bonds. When rates go up, it would be easy to switch. Not so easy to switch if you have it the other way around.


Plan Administrator Due Diligence Safe Harbors Developed for Rollover Verifications (Rev. Rul. 2014-9) by CCH Group

Rolling over pre-tax money in traditional IRAs to an employer-sponsored plan is a prerequisite for doing the backdoor Roth. Most plans already accept incoming rollovers from traditional IRAs. For those that still don’t, the IRS provided guidance to help ease their concerns. So hopefully even more plans will do so.


Special CDs with Top Rates at Patriot FCU in Parts of PA and MD by Ken Tumin at

If you missed the 3% 5-year CD from PenFed in January here’s another chance but only if you are in certain parts of PA or MD.


Thank you for reading. Enjoy your weekend.

P.S. If you are not getting this by e-mail right now, remember to sign up for e-mail subscription: either everything or just the digest.

Save Money Or Make Money is copyrighted material from The Finance Buff. All rights reserved.

by Harry Sit at April 11, 2014 12:25 PM

sacha chua :: living an awesome life

Working fast and slow

When it comes to personal projects, when does it make sense to work quickly and when does it make sense to work slowly? I’ve been talking to people about how they balance client work with personal projects. It can be tempting to focus on client work because that comes with clear tasks and feedback. People’s requests set a quick pace. For personal projects, though, the pace is up to you.

It’s easy to adopt the same kinds of productivity structures used in the workplace. You can make to-do lists and project plans. You can set your own deadlines. I want to make sure that I explore different approaches, though. I don’t want to just settle into familiar patterns.

2014-04-07 Working fast and slow #experiment

I work on personal projects more slowly than I work on client projects. When I work on client tasks, I search and code and tweak at a rapid speed, and it feels great to get a lot of things done. My personal projects tend to be a bit more meandering. I juggle different interests. I reflect and take more notes.

Probably the biggest difference between client work and personal projects is that I tend to focus on one or two client tasks at a time, and I let myself spread out over more personal projects. I cope with that by publishing lots of little notes along the way. The notes make it easier for me to pick up where I left off. They also let other people learn from intermediate steps, which is great for not feeling guilty about moving on. (Related post: Planning my learning; it’s okay to learn in a spiral)

Still, it’s good to examine assumptions. I assume that:

  • doing this lets me work in a way that’s natural to me: what if it’s just a matter of habit or skill?
  • it’s okay to be less focused or driven in my learning, because forcing focus takes effort: it’s probably just the initial effort, though, and after that, momentum can be useful
  • combinations of topics can be surprisingly interesting or useful: are they really? Is this switching approach more effective than a serial one or one with larger chunks?
  • a breadth-first approach is more useful to me than a depth-first one: would it help to tweak the depth for each chunk?
2014-04-02 On thinking about a variety of topics - a mesh of learning #my-learning

One of my assumptions is that combining topics leads to more than the sum of the parts. I took a closer look at what I write about and why. What do I want from learning and sharing? How can I make things even better?

2014-04-02 Evaluating my sharing #sharing #decision

Emacs tinkering is both intellectually stimulating and useful to other people. It also works well with applied rationality, Quantified Self, and other geekery. I can align sketchnoting by focusing on technical topics and on making it easier to package things I’ve learned. Blogging and packaging happen to be things I’ve been learning about along the way. Personal finance is a little disconnected from other topics, but we’ll see how this experiment with the Frugal FIRE show works out.

If I had to choose one cluster of topics, though, it would be the geek stuff. I have the most fun exploring it, and I am most interested in the conversations around it.

What does that mean, then? Maybe I’ll try the idea of a learning sprint: to focus all (or almost all) my energies on one topic or project each week. I can work up to it gradually, starting with 2-4 hour blocks of time.

2014-04-02 Imagining learning sprints #my-learning

Because really, the rate-limiting factor for my personal projects is attention more than anything else. If I experiment with reducing my choices (so: Emacs basics, Emacs chats, open source, Quantified Self), that will probably make it easier to get the ball rolling.

2014-03-28 Identifying rate-limiting factors in my work #kaizen

So I’m still not adopting the taskmaster approach, but I’m reminding myself of a specific set of areas that I want to explore, gently guiding the butterflies of my interest down that way.

We’ll see how it works out!

The post Working fast and slow appeared first on sacha chua :: living an awesome life.

by Sacha Chua at April 11, 2014 12:00 PM

One Thing Well


WGif
WGif is a command line tool for creating animated GIFs from YouTube videos.

April 11, 2014 12:00 PM

Schneier on Security

Police Disabling Their Own Voice Recorders

This is not a surprise:

The Los Angeles Police Commission is investigating how half of the recording antennas in the Southeast Division went missing, seemingly as a way to evade new self-monitoring procedures that the Los Angeles Police Department imposed last year.

The antennas, which are mounted onto individual patrol cars, receive recorded audio captured from an officer’s belt-worn transmitter. The transmitter is designed to capture an officer’s voice and transmit the recording to the car itself for storage. The voice recorders are part of a video camera system that is mounted in a front-facing camera on the patrol car. Both elements are activated any time the car’s emergency lights and sirens are turned on, but they can also be activated manually.

According to the Los Angeles Times, an LAPD investigation determined that around half of the 80 patrol cars in one South LA division were missing antennas as of last summer, and an additional 10 antennas were unaccounted for.

Surveillance of power is one of the most important ways to ensure that power does not abuse its status. But, of course, power does not like to be watched.

by schneier at April 11, 2014 11:41 AM

The Tech Report - News

You can get your own Google Glass specs on April 15

Dying to try Google Glass? Until now, the only way to get your hands on Google's $1500 cyborg specs was to fill out a form and wait for a spot to open up in the Glass Explorer Program. For a limited time next week, however, Google plans to let people in without a waiting period. Here's the skinny, according to the Glass Google+ account:

Too bad for folks who live outside the States—Google says it's "just not ready yet to bring Glass to other countries." And, you know, ...


April 11, 2014 11:00 AM

Greg Mankiw's Blog

Next time you hear someone advocate for single-payer healthcare, remember this

From the NY Times:
Two Florida doctors who received the nation’s highest Medicare reimbursements in 2012 are both major contributors to Democratic Party causes, and they have turned to the political system in recent years to defend themselves against suspicions that they may have submitted fraudulent or excessive charges to the federal government.... 
Topping the list is Dr. Salomon E. Melgen, 59, an ophthalmologist from North Palm Beach, Fla., who received $21 million in Medicare reimbursements in 2012 alone....  
Dr. Melgen’s firm donated more than $700,000 to Majority PAC, a super PAC run by former aides to the Senate majority leader, Harry Reid, Democrat of Nevada. The super PAC then spent $600,000 to help re-elect Senator Robert Menendez, Democrat of New Jersey, who is a close friend of Dr. Melgen’s. Last year, Mr. Menendez himself became a target of investigation after the senator intervened on behalf of Dr. Melgen with federal officials and took flights on his private jet.

by Greg Mankiw at April 11, 2014 09:18 AM

Kevin DeYoung

Repent, for the Kingdom of Heaven Is at Hand

Revelation 9:20-21 “The rest of mankind, who were not killed by these plagues, did not repent of the works of their hands nor give up worshiping demons and idols of gold and silver and bronze and stone and wood, which cannot see or hear or walk, nor did they repent of their murders or their sorceries or their sexual immorality or their thefts.”

God’s word to the peoples of the world is not only an offer of grace, nor even less simply a call to live rightly, nor even less still a promise to make all our dreams come true if we just have faith. We have not heard all that God wants to say to us unless we have heard his command to repent.

Ezekiel said “Repent and turn from your transgressions” (Ezek. 18:30).  John the Baptist said “Repent, for the kingdom of heaven is at hand” (Matt. 3:2).  Jesus said “Repent and believe in the gospel” (Mark 1:15).  Peter said “Repent and be baptized” (Acts 2:38).  And Paul said God “commands all people everywhere to repent” (Acts 17:30).

Repentance has never been easy. No one likes to be told “Die to yourself.  Kill that in you.  Admit you are wrong and change.”  That’s never been an easy sell. It’s much easier to get a crowd by leaving out the repentance part of faith, but it’s not faithful. It’s not even Christianity. Of course, there is a whole lot more to following Jesus than repentance, but it’s certainly not less.  “Repent,” Jesus said, or “you will all likewise perish” (Luke 13:5).

Repentance has always been hard, always will be hard.

Regret, now that’s easy. Suppose you walk into work one day, furious for no good reason. You get paid well and treated nicely, but you feel like your supervisor is unfair. You should have got the promotion, even though you were less qualified and less experienced.  Nevertheless, you march into the office and let your supervisor have it. You tell him where to stick this job. You tell him exactly what you think about him and his wife and his mother and his grandmother and his dog. Next thing you know, you’re fired. Later that night, you feel just sick about the whole thing.  How could you have been so stupid to say all that? Now you’re out of work. That’s regret. You don’t have to see your sin or admit wrong and be humbled to feel regret. You just have to feel bad about the consequences of your actions. It’s easy to have regret, but that’s not repentance.

Embarrassment is easy too. Suppose you’re out in the lobby after church and a group of you are chatting about “her.”  No one has talked to “her,” but you’re all talking about her–what’s wrong in her marriage, what’s wrong with her kids, what’s wrong with her house. You aren’t strategizing how to help her. You’re just talking about her. And then you realize she’s been looking for her coat right behind you the whole time.  She’s heard the whole thing.  And as she bolts out of church crying, you feel just terrible. You are so embarrassed. Now, it may be that you are really struck in your conscience and you are moved to ask for forgiveness. But it could be that you are just embarrassed at being caught. You feel terrible, not so much with having gossiped, but that she heard you gossiping. You wonder what she thinks of you now and if she will tell others about this incident. Sure, you feel terrible, but it’s out of love for your reputation, not out of hatred for sin. You’re simply embarrassed, and that’s not repentance.

Apology is not repentance either.  To be sure, repentance often involves an apology. But just because you’ve issued an apology doesn’t mean you’ve repented.  We’ve all heard and given pseudo-apologies.  “I’m sorry if you were offended.”  “I’m sorry if you took things the wrong way.”  “I’m sorry I said that about your kids.  It’s not that I think they’re bad kids; they’re just wild, unruly, and undisciplined.  I’m sorry you’re so sensitive.” Or even when the apology is sincere, it may not be a sincere statement of repentance.  It may just be a sincere statement of remorse or shame.

So regret is easy, embarrassment is easy, and apology is easy.  Repentance, on the other hand, is very hard and, therefore, much rarer.  Repentance involves two things: a change of mind and a change of behavior.

Repentance means you change your mind.  That’s what the Greek word metanoia means: a changed (meta) mind (noia).

You change your mind about yourself: “I am not fundamentally a good person deep down.  I am not the center of the universe.  I am not the king of the world or even my life.”

You change your mind about sin: “I am responsible for my actions.  My past hurts do not excuse my present failings.  My offenses against God and against others are not trivial.  I do not live or think or feel as I should.”

And you change your mind about God: “He is trustworthy.  His word is sure.  He is able to forgive and to save. I believe in his Son, Jesus Christ. I owe him my life and my allegiance. He is my King and my Sovereign, and he wants what is best for me.  I believe it!”

Repentance is hard because changing someone’s mind is hard.  In fact, when we’re dealing with spiritual matters of the heart, God’s the only one who can really change your mind.  People are simply not predisposed to say “I was wrong! I was wrong about God and about myself. My whole way of looking at the world has been in error.  I want to change.”  That’s repentance.  And it’s amazing when it happens.

In the classic book detailing his conversion to Christianity, Orthodoxy, G.K. Chesterton compares his journey to that of an English yachtsman who slightly miscalculated his course and discovered England while under the impression he was in the South Seas.  That’s how Chesterton came to Christ.  He rejected Christianity and set out to find what was really true.  And when he found the truth, he discovered he was back home again.  What he found had been there all along.

Here’s how he describes his metanoia, his change of mind:

For if this book is a joke it is a joke against me.  I am the man who with the utmost daring discovered what had been discovered before…No one can think my case more ludicrous than I think it myself; no reader can accuse me here of trying to make a fool of him: I am the fool of this story, and no rebel shall hurl me from my throne.

I freely confess all the idiotic ambitions of the end of the nineteenth century.  I did, like all other solemn little boys, try to be in advance of the age.  Like them I tried to be some ten minutes in advance of the truth.  And I found that I was eighteen hundred years behind it.  I did strain my voice with a painfully juvenile exaggeration in uttering my truths.  And I was punished in the fittest and funniest way, for I have kept my truths: but I have discovered, not that they were not truths, but simply that they were not mine.

When I fancied that I stood alone I was really in the ridiculous position of being backed up by all Christendom…The man from the yacht thought he was the first to find England; I thought I was the first to find Europe. I did try to found a heresy of my own; and when I had put the last touches to it, I discovered that it was orthodoxy.

Chesterton changed his mind.  He admitted he was a fool and a joke.  He had professed all the latest ideas.  But the one thing he was sure was most wrong ended up being right.

Repentance also involves a change of behavior.  It’s like a train conductor driving his train down the tracks straight for the side of a mountain.  It’s one thing for him to realize and admit that his train is going in the wrong direction.  It’s another thing to stop the train and get it going in the opposite direction.

As you probably know, prior to becoming a Christian, John Newton was a drunken sailor and a slave trader.  He was converted in a storm at sea.  His whole life did not change at once.  But because his repentance was genuine, he did change.

I stood in need of an Almighty Savior; and such a one I found described in the New Testament.  Thus far, the Lord had wrought a marvelous thing: I was no longer an infidel: I heartily renounced my former profaneness, and had taken up some right notions…I was sorry for my past misspent life, and purposed an immediate reformation.  I was quite freed from the habit of swearing, which seemed to have been as deeply rooted in me as a second nature.  Thus, to all appearance, I was a new man.

From there he had a long road of being transformed from one degree of glory to the next. He changed his mind, and his behavior slowly began to follow suit. It took time, but he bore fruit in keeping with repentance (Luke 3:8). He did not change in order to become new; rather, his change proved that he was a new man.

If we preach a “gospel” with no call to repentance we are preaching something other than the apostolic gospel.

If we knowingly allow unconcerned, impenitent sinners into the membership and ministry of the church, we are deceiving their souls and putting ours at risk as well.

If we think people can find a Savior without forsaking their sin, we do not know what sort of Savior Jesus Christ is.

There are few things more important in life than repentance.  So important, in fact, that Revelation, the Gospels, the Epistles, and the Old Testament all make clear that you don’t go to heaven without it.

by Kevin DeYoung at April 11, 2014 09:04 AM

Cool Tools

Logitech B530 USB Headset

I purchased this on a whim a few months ago to use at my office. To say that it was a great investment would be an understatement. Comfortable and flexible, the headset does its job well, providing incredible sound quality unmatched by any other headset I’ve used. The mic is also surprisingly sensitive, capturing the smallest or thinnest sounds and making them audible (often to the detriment or embarrassment of the user). I would highly recommend it.

-- Jacob

B530 USB Headset

Available from Amazon

by mark at April 11, 2014 09:00 AM


Chrysologus for Lent XXXVIII

Why should Christians not trust Despair or Unbelief? They are the way Death wages war; with these generals and with these tactics in a battle of this sort she captures, crushes, and kills all those whom nature brings forth into the present life. She holds sway over kings, she conquers peoples, she routs nations. It has never been possible to bribe her with wealth, or to move her by entreaties, or to soften her by tears, or to conquer her by strength.

Sermon 118, section 6.

by Brandon at April 11, 2014 08:50 AM

Table Titans

Tales: You Can Taste the Way It Dances


My favorite D&D campaign was one that I based off of a popular MUD that I had played extensively. My players were (unbeknownst to them) the reincarnated heralds of sleeping gods, and their task was to travel across the continent, waking up the gods in order to stop what was essentially Satan from…

Read more

April 11, 2014 07:00 AM

see shy jo

propellor introspection for DNS

In the just-released Propellor 0.3.0, I've improved Propellor's config file DSL significantly. Now properties can set attributes of a host that can be looked up by its other properties, using a Reader monad.

This saves needing to repeat yourself:

hosts = [ host ""
        & stdSourcesList Unstable
        & Hostname.sane -- uses hostname from above

And it simplifies docker setup, since there is no longer a need to differentiate between properties that configure docker and properties of the container:

 -- A generic webserver in a Docker container.
    , Docker.container "webserver" "joeyh/debian-unstable"
        & Docker.publish "80:80"
        & Docker.volume "/var/www:/var/www"
        & Apt.serviceInstalledRunning "apache2"

But the really useful thing is, it allows automating DNS zone file creation, using attributes of hosts that are set and used alongside their other properties:

hosts =
    [ host ""
        & ipv4 ""

        & cname ""
        & Docker.docked hosts "openid-provider"

        & cname ""
        & Docker.docked hosts "ancient-kitenet"
    , host ""
        & Dns.primary "" hosts

Notice that hosts is passed into Dns.primary, inside the definition of hosts! Tying the knot like this is a fun haskell laziness trick. :)

Now I just need to write a little function to look over the hosts and generate a zone file from their hostname, cname, and address attributes:

extractZoneFile :: Domain -> [Host] -> ZoneFile
extractZoneFile = gen . map hostAttr
  where gen = -- TODO

The eventual plan is that the cname property won't be defined as a property of the host, but of the container running inside it. Then I'll be able to cut-n-paste move docker containers between hosts, or duplicate the same container onto several hosts to deal with load, and propellor will provision them, and update the zone file appropriately.

Also, Chris Webber had suggested that Propellor be able to separate values from properties, so that eg, a web wizard could configure the values easily. I think this gets it much of the way there. All that's left to do is two easy functions:

overrideAttrsFromJSON :: Host -> JSON -> Host

exportJSONAttrs :: Host -> JSON

With these, propellor's configuration could be adjusted at run time using JSON from a file or other source. For example, here's a containerized webserver that publishes a directory from the external host, as configured by JSON that it exports:

demo :: Host
demo = Docker.container "webserver" "joeyh/debian-unstable"
    & Docker.publish "80:80"
    & dir_to_publish "/home/mywebsite" -- dummy default
    & Docker.volume (getAttr dir_to_publish ++":/var/www")
    & Apt.serviceInstalledRunning "apache2"

main = do
    json <- readJSON "my.json"
    let demo' = overrideAttrsFromJSON demo json
    writeJSON "my.json" (exportJSONAttrs demo')
    defaultMain [demo']

April 11, 2014 05:05 AM

The Gospel Coalition Blog

I Want My Kids Brainwashed

"I don't want to send our son to church to be brain-washed like those Stoddard kids!" our atheist friend said to his wife. He grew up in East Germany, and we had been church-planting in the former East for a few years by then. At first, I was offended that he would view the kids' program at our church as brainwashing. But then, I couldn't forget that he was probably taught Marx's view of religion throughout his life:

Religion is the sigh of the oppressed creature, the heart of a heartless world, just as it is the spirit of a spiritless situation. It is the opium of the people. The abolition of religion as the illusory happiness of the people is required for their real happiness. (Karl Marx, Critique of Hegel's Philosophy of Right)


According to Marx, if people were to truly think for themselves, they'd detox themselves from using the addicting, mind-altering power of religion to numb their pain. But ironically, the effort in East Germany to systematically eradicate religion from society required a new form of brainwashing to inculcate its people with the socialist ideal. An atheistic society was forged, Christian holidays were renamed, and Christian rites such as baptisms, weddings, and confirmation were replaced with socialist ones.

The loss of individualism feared by my friend actually happened in East Germany under the guise of heralding Marxist equality. Socialist brainwashing appeared to be the only solution to the problems caused by Nazi brainwashing. Meanwhile, capitalism and individualism imposed a new and different tyranny of tolerance on the West, at the expense of individual opinion. As we can see, wherever we live, our thinking is a product of our culture, upbringing, and the political system to which we are subjected. Freedom of thought is perhaps an illusion, because we cannot ever think in a vacuum.

Can Our Brains Lead Us to Morality?

With reason as our guide, the so-called Enlightenment argued, we can all become moral, responsible, tolerant good citizens. The Enlightenment called people to trust Reason, and if we could all agree on what is reasonable, we could all live together with a certain set of commonly shared values.

But can logical deductions alone lead us to morality? Though our ability to reason comes from God, we can use this tool to selfish ends, rationalizing all sorts of immoral things by putting ourselves and our needs at the center of reality. This process happens to us as individuals but also to entire cultures and systems. Recently my husband and I visited the Wannsee Haus, a beautiful villa nestled in a rich neighborhood on the shores of Lake Wannsee. There, on January 20, 1942, over breakfast, the most powerful men in Germany masterminded the Endlösung, the final solution for the so-called problem of the Jews in Europe. They drew up an elaborate plan to deport thousands upon thousands to their deaths.

These well-educated men listened to Bach and Mozart but came up with the most morally abject plan of all history. Their "solution" seemed entirely reasonable to them at the time. They led a whole nation astray, and few had the courage to stand up against them. So is it possible for reason to run amok? Yes, according to history.

Do We Need Brain-Washing?

Even a superficial glance at history makes it painfully clear that Reason alone cannot lead people to be good. Why? Because our ability to reason is radically flawed and limited in scope. Here in Germany we have the Holocaust as a glaring example. But it happens everywhere. Look at "wonderful" ideas such as the Crusades in Europe, the enslavement of Africans in America, the Cultural Revolution in China, the Rwandan genocide, or the recently uncovered North Korean atrocities. In the face of such a vast moral abyss, the doctrine of total depravity, though at first glance seemingly depressing, actually comforts me. It explains the human propensity toward evil. Human beings are not good at the core. If they were, how could we end up in such a mess? Most people certainly aren't as bad as they could be, but the fall affected our beings in their totality. Every aspect of who we are as humans is broken: our bodies, our emotions, our sexuality, and our thinking.

We put ourselves at the center of the universe and think more highly of ourselves than we ought. We become our own standard, make our own sense out of this world and only trust our own faulty thinking when it comes to making decisions. This process of neither trusting God nor honoring him in our thinking is foolishly self-centered and leads our hearts down the path to darkness (Rom. 1:21). Paul's solution to this problem is recognizing that our minds are sinful and that the healing of our minds has to come from outside of us. The Holy Spirit must renew them (Rom. 12:2-3).

Paul does not tell us to stop testing, discerning, or judging soberly. But we must do these things in faith, and the outcome of our thinking should be understanding and embracing the will of God, which is good, acceptable, and perfect. If our thinking leads us down any other path, it is most likely self-absorbed and darkened. Our brains cannot lead us to morality, but God's Spirit can!

So should I be offended if someone thinks church is brain-washing my kids? No, on the contrary! Maybe, next time, I can come up with a better answer for my critics, not responding with arrogance but with the message of the gospel, namely that "he saved us, not because of works done by us in righteousness, but according to his own mercy, by the washing of regeneration and renewal of the Holy Spirit" (Titus 3:5).

My kids' brains desperately need washing, as does mine. My children were born with intrinsic self-absorption that, if left unchallenged, might lead them down dangerous paths, both for themselves and others around them. But Jesus—the Logos, Reason incarnate—is the only one who has ever thought all of God's thoughts after him in a perfect way. Through his blameless life my kids will know what pleases God, and through his blood their minds can be cleansed.  I pray that someday their minds will be so renewed that they will stand against some of the evils the world around them has embraced without a second thought.

by Eowyn Stoddard at April 11, 2014 05:01 AM

Should Churches Offer Vocational Retraining for Fallen Pastors?

Does financial security prevent ministers from repenting of sin, and if so what should the church do about it? This question assumes that preparation for ministry does not easily translate to other fields, so the economic incentive to hide sin is strong. Thus, the practical question: Should churches offer vocational retraining for fallen pastors?

The stakes are high for a pastor to remain on the straight and narrow. His own testimony, the health of his family and church, and the reputation of Christ are on the line. Of course, this is true for every Christian, but there is a particular urgency for pastors because of their responsibility before almighty God (James 3:1).

All these things raise the motivation to hide sin. The fallout of repenting would be nuclear. His personal income is on the line, and thus the security of his family. Unlike the engineer or English professor in the congregation who can fail morally but may be able to get by unfazed professionally, a pastor's earning potential is affected the moment he's discovered.

Finding himself in such a situation, a compromised pastor will simply promise himself (and God) he won't compromise anymore, and that will be the end of whatever vice he's been indulging. But it never works. Unconfessed sin is a sure way both to invite the opposition of God (Psalm 32:3-4) and to harden into self-deception (Hebrews 3:12-13). So should a church have a pre-standing offer of vocational retraining to encourage a compromised pastor to come clean?

Why a Policy Doesn't Work

As a policy, no. The two main purposes for such a policy would be to encourage openness regarding moral failure and to show fairness to a man whose sole training was for ministry-related tasks. But such a policy would fail at both purposes. First, the assurance of vocational retraining will not necessarily increase the likelihood of repentance. The genuine conviction of the Holy Spirit will jump a low or a high hurdle all the same. Second, such a policy would rob the congregation of the opportunity to actively love a fallen brother. Vocational retraining would be something he is contractually owed rather than something he is graciously given.

Let me explain both of these points a bit more. First, the promise of financial security beyond ministry will not increase the likelihood of repentance. The assurance of vocational retraining is like a safety net for a well-known tightrope walker. It may spare a broken neck, but it won't save a shattered reputation. The tightrope walker would probably take the broken neck over the negated career. The excruciating cost for a pastor confessing his moral failure transcends earning potential—his professional reputation, his marriage and family makeup, his sense of the meaning of his very existence. In other words, there are plenty of other reasons his flesh will find to hide if he is not sincerely convicted by the Holy Spirit.

But if he is, then the world couldn't stop him from repenting. I've watched men face withering consequences for coming to the light, convinced that any earthly consequence was tolerable if the Lord Jesus would spare them from the final judgment. This is the mark, in fact, of godly sorrow in contrast to worldly sorrow (2 Corinthians 7:10-13). In the end, no commitment by the church for vocational retraining can counter the deceitfulness of sin.

Logic of Love

Second, a policy of offering vocational retraining to a fallen minister implies fairness over love. The logic of fairness would say that since the man's vocational preparation was exclusively for the tasks of ministry—exegesis and homiletics, discipleship training and counseling—then he ought to be offered adequate preparation for a new career. It's only fair.

But the logic of love is different. It would say that this minister has fallen into a sin common to us all, but with uniquely devastating consequences. Love means considering his interests despite there being no official obligation to do so. Isn't this what Jesus illustrated by the story of a compassionate Samaritan and a pair of unwilling Jews? (Luke 10:29-37)

Love is best expressed personally, not contractually. The love that Christians should hold for one another will personally motivate them to help a fallen brother. Obviously, there is no guarantee of this love. And that's the point. Love is expressed not in contractual guarantee, but in the spontaneous overflow of covenant commitment. A church policy that offers tuition reimbursement for a fallen pastor to get an MBA is very different from a member of the church who owns a furniture business offering him gainful employment and training.

Even if the church did want to go the route of supplementing an MBA or some other training, it's best done through an unprompted act of benevolence, not from some prior agreement. This arrangement keeps the line clear between some inaccurate sense of employee fairness and a genuine act of undeserved generosity.

For those of us in ministry, we do well to plead with the Lord frequently to spare us from being that guy. We should beg Christ for the kind of love that motivates our holiness far better than the fear of earthly consequences alone. But it should also be said that God is generous to sinners devastated by the consequences of sin. Psalm 38 stands as a testimony that God welcomes prayers for forgiveness of sin as well as help with the consequences we have caused by it.

No fallen pastor who is a child of God disqualifies himself from his Father's promise to provide. A repentant pastor will learn this promise in the end, regardless of any church policy.

by Jeremy Pierre at April 11, 2014 05:01 AM

Embedded in Academia

Heartbleed and Static Analysis

Today in my Writing Solid Code class we went through some of the 151 defects that Coverity Scan reports for OpenSSL. I can’t link to these results, but take my word for it that they are a pleasure to read — the interface clearly explains each flaw and the reasoning that leads up to it, even across multiple function calls. Some of the problems were slightly alarming, but we didn’t see anything that looks like the next heartbleed. Unfortunately, Coverity did not find the heartbleed bug itself. (UPDATE: See this followup post.)

This is puzzling; here’s a bit of speculation about what might be going on. There are basically two ways to find heartbleed using static analysis (here I’ll assume you’re familiar with the bug; if not, this post is useful). First, a taint analysis should be able to observe that two bytes from an untrusted source find their way into the length argument of a memcpy() call. This is clearly undesirable. The Coverity documentation indicates that it taints the buffer stored to by a read() system call (unfortunately you will need to log in to Coverity Scan before you can see this). So why don’t we get a defect report? One guess is that since the data buffer is behind two pointer dereferences (the access is via s->s3->…), the scanner’s alias analysis loses track of what is going on. Another guess is that two bytes of tainted data are not enough to trigger an alarm. Only someone familiar with the Coverity implementation can say for sure what is going on — the tool is highly sophisticated not just in its static analysis but also in its defect ranking system.

The other kind of static analysis that would find heartbleed is one that insists that the length argument to any memcpy() call does not exceed the size of either the source or destination buffer. Frama-C in value analysis mode is a tool that can do this. It is sound, meaning that it will not stop complaining until it can prove that certain defect classes are not present, and as such it requires far more handholding than does Coverity, which is designed to unsoundly analyze huge quantities of code. To use Frama-C, we would make sure that its own header files are included instead of the system headers. In one of those files we would find a model for memcpy():

/*@ requires \valid(((char*)dest)+(0..n - 1));
  @ requires \valid_read(((char*)src)+(0..n - 1));
  @ requires \separated(((char *)dest)+(0..n-1),((char *)src)+(0..n-1));
  @ assigns ((char*)dest)[0..n - 1] \from ((char*)src)[0..n-1];
  @ assigns \result \from dest;
  @ ensures memcmp((char*)dest,(char*)src,n) == 0;
  @ ensures \result == dest;
  @*/
extern void *memcpy(void *restrict dest,
                    const void *restrict src, 
                    size_t n);

The comments are consumed by Frama-C. Basically they say that src and dest are pointers to valid storage of at least the required size, that the buffers do not overlap (recall that memcpy() has undefined behavior when called with overlapping regions), that it moves data in the proper direction, that the return value is dest, and that a subsequent memcmp() of the two regions will return zero.

The Frama-C value analyzer tracks an integer variable using an interval: a representation of the smallest and largest value that the integer could contain at some program point. Upon reaching the problematic memcpy() call in t1_lib.c, the value of payload is in the interval [0..65535]. This interval comes from the n2s() macro which turns two arbitrary-valued bytes from the client into an unsigned int:

#define n2s(c,s)        ((s=(((unsigned int)(c[0]))<< 8)| \
                            (((unsigned int)(c[1]))    )),c+=2)

The dest argument of this memcpy() turns out to be large enough. However, the source buffer is way too small. Frama-C would gripe about this and it would not shut up until the bug was really fixed.

How much effort would be required to push OpenSSL through Frama-C? I don’t know, but I wouldn’t plan on getting this done in a week or three. Interestingly, a company spun off by Frama-C developers has recently used the tool to create a version of PolarSSL that they promise is immune to CWEs 119, 120, 121, 122, 123, 124, 125, 126, 127, 369, 415, 416, 457, 562, and 690. I think it would be reasonable for the open source security community to start thinking more seriously about what this kind of tool can do for us.


  • In a comment below, Masklinn states that OpenSSL’s custom allocators would defeat the detection of the too-large argument to memcpy(). This is indeed a danger. To avoid it, as part of applying Frama-C to OpenSSL, the custom malloc/free functions would be marked as being malloc-like using the “allocates” and “frees” keywords supported by ACSL 1.8. Coverity lets you do the same thing, and so would any competent analyzer for C. Custom allocators are regrettably common in oldish C code.
  • I’m interested to see what other tools have to say about heartbleed. If you have a link to results, please put it in a comment.
  • I re-ran Coverity after disabling OpenSSL’s custom freelist and also hacking CRYPTO_malloc() and friends to just directly call the obvious function from the malloc family. This caused Coverity to report 173 new defects: mostly use-after-free and resource leaks. Heartbleed wasn’t in the list, however, so I stand by my guess (above) that perhaps something related to indirection caused this defect to not be ranked highly enough to be reported.
  • HN has some discussion of PolarSSL and of this blog post. Also Reddit.

by regehr at April 11, 2014 04:58 AM

The American Conservative » Articles

John Wayne’s Lost Legacy

John Wayne’s gravestone in Corona del Mar, California, is marked with the following inscription: “Tomorrow is the most important thing in life. Comes into us at midnight very clean. It’s perfect when it arrives and it puts itself in our hands. It hopes we’ve learned something from yesterday.” It’s a lovely sentiment—poetic in its way, and more than a little unexpected coming from the gun-slinging “Duke” of the Hollywood western. But it’s tragic, too, because tomorrow has not been kind to John Wayne, if we take “tomorrow” not in the narrow and literal sense, meaning the day following the current one, but in the expansive and, yes, poetic sense, meaning all that comes after. In fact, by the time he spoke those words in a 1971 interview with Playboy, tomorrow had already decided it didn’t have much to learn from John Wayne.

The man born Marion Morrison in Winterset, Iowa, in 1907 had one of the most enviable careers in the history of film. But in terms of his profession—movie actor—and the influence his legacy has had on the industry he dominated for several decades in his prime, well, let’s just say that Wayne’s mid-century style of film acting is more than just obsolete; it bears no resemblance whatsoever to the craft as it is currently practiced. Nobody does it like that anymore (except maybe Clint Eastwood). Actors today trace their professional lineage not to Wayne, or highly regarded contemporaries such as Spencer Tracy or Jimmy Stewart, but to the brutal emotional realism of Marlon Brando and, through him, to the American “method” of acting developed by the Group Theatre in the early 1930s. Brando’s spellbinding performance as Stanley Kowalski in 1951’s ”A Streetcar Named Desire” touched off a revolution and made the film acting that came before it seem like a classic case of same genus, different species. Brando did to Wayne what the iPod did to the Walkman—both play music, but the similarities end there.

Buried deep within Scott Eyman’s large new biography, John Wayne: The Life and Legend, is an anecdote from the set of 1961’s “The Comancheros” that captures the artistic distance between Wayne’s lunch-bucket approach to acting and the method madness of Brando’s apostles. Michael Curtiz (of “Casablanca” fame) was the film’s credited director, but Wayne was forced to step in and direct numerous scenes when Curtiz, battling terminal cancer, became too weak to work. The film’s female lead, Broadway actress Ina Balin, was set to play a short scene with Wayne, who was eager to shoot it, print it, and move on. Balin wanted to explore her character’s motivations and asked for a rehearsal. Wayne consented, but whispered to cinematographer William Clothier to roll film as they “rehearsed.” When the scene was finished, Balin was shocked to hear Wayne bark, “Cut. Print. See how easy this is?” According to co-star Stuart Whitman, when Balin would “dig down and get emotional” Wayne would mumble under his breath, “Get the goddamn words out.”

Wayne’s general attitude toward the craft of acting—“get the goddamn words out”—is not dead. Many actors and directors work quickly. But no well-regarded modern, American actor focuses, as Wayne and his ilk did, exclusively on the external attributes of a character. “Wayne was a member in good standing of a pre-Method generation of actors,” writes Eyman, “whose general intent was, as James Cagney put it, ‘Look the other actor in the eye and tell the truth.’” Wayne was obsessed with being truthful on screen, even if that meant turning down offers from good directors of meaty parts that he felt he couldn’t authentically carry off. He employed a handful of writers whose job was to finesse his lines so that they sounded genuine coming out of his mouth. Wayne was equally committed to preserving his image. He was, Eyman writes, “emotionally committed to playing only John Wayne parts.” These were parts that looked like a man should look and acted like a man should act. When a still photographer on the set of “True Grit” snapped a few shots of the aging Wayne riding not on a horse but on a specially outfitted mechanical saddle, the star ran him down and smashed his camera. “He was always going to be in the John Wayne business, always going to be protecting the franchise,” says Eyman.

Wayne couldn’t fathom why other stars weren’t as diligent as he was about “protecting the franchise.” At the 1957 premiere of Vincente Minnelli’s “Lust for Life,” Wayne upbraided star Kirk Douglas for playing the part of Vincent van Gogh like a “weak queer.” “How can you play a part like that? There’s so few of us left. We got to play strong, tough characters,” said Wayne.

“It’s all make-believe, John,” a dumbfounded Douglas replied. “It isn’t real. You’re not really John Wayne, you know.”

If Eyman is to be believed, John Wayne was really John Wayne. The actor known for playing strong, tough characters was a strong, tough character in real life, too. He loved making westerns because it allowed him to spend his days outdoors doing vigorous, physical things. For recreation, he liked nothing more than fishing for giant salmon off the coast of Vancouver in his 136-foot yacht, Wild Goose, a decommissioned Navy minesweeper. He enjoyed the company of manly men, such as frequent co-star Ward Bond and screenwriter James Edward Grant, and was never happier than when playing a game of poker outside a tent on location in Mexico. There was always a bottle of tequila on the table.

This is a panoramic and entertaining book. It brings the fast-fading world of classic Hollywood into sharp focus. If it has one flaw, it’s in the author’s clear disdain for his subject’s conservative politics. Wayne was involved in the blacklisting of Hollywood communists, Eyman believes, through his long association with the Motion Picture Alliance for the Preservation of American Ideals, an organization dedicated to turning back the efforts of “Communist, Fascist, and other totalitarian-minded groups to pervert this powerful medium into an instrument for the dissemination of un-American ideas and beliefs.” Furthermore, Wayne supported the Vietnam War. Eyman quotes a letter sent by Wayne to President Lyndon Johnson asking for help coordinating the filming of “The Green Berets”: “Let’s make sure it is the kind of picture that will help our cause throughout the world.” A red-hunting supporter of America’s Southeast Asian military misadventures? Nothing could be more grotesque to the modern liberal mind. But while it’s understandable that Eyman doesn’t share Wayne’s conservatism—not everyone does—it’s off-putting that the author is so desperate to let the reader know it. This is the mark of a snob.

Wayne’s politics can be summed up very simply: he loved America and he hated communism; he loved liberty and he hated dependency; he thought the movies were the best thing going and he didn’t want them turned into a left-wing propaganda tool. Yet, for all Eyman’s antipathy, and for all Wayne’s deeply held convictions, the book makes it plain that the star was a straight-shooter who refused to hold a co-worker’s creed, color, politics, or sexuality against him. “For Wayne,” according to Eyman, “personality always trumped politics; if he liked you, he was willing to overlook your ideology.” One doubts that this spirit is much in evidence in today’s film industry. Hollywood still has a lot to learn from yesterday.

Matthew Hennessey is an associate editor of the Manhattan Institute’s City Journal.

by Matthew Hennessey at April 11, 2014 04:05 AM

The Brooks Review

Minimal ToDo

Walter Somerville on Begin:

It’s not a system that will appeal to everyone, but I have found it very helpful, if only to slow down and think through the day before I start working.

Please consider becoming a member to support my writing. All writing is 100% member funded.

by Ben Brooks at April 11, 2014 03:14 AM

The Tech Report - News

For Valve, is the Steam controller both a blessing and a curse?

As excited as we may be about Steam machines democratizing PC gaming in the living room, there's no question Valve's new platform will encounter some obstacles on its road to success. Valve will have to persuade other developers to support a new platform, for starters, and it will be forced to work with hardware makers, particularly AMD and Nvidia, to ensure SteamOS gets top-notch driver support. On top of that, Valve will need to convince PC gamers to step out of their comfort zone and embrace a wildly different operating system without support for many familiar apps and games.

At the Game Developers Conference last month, I became aware ...


April 11, 2014 03:06 AM

Englewood Christian Church: We Blog! » ERB

ERB Weekly Digest – April 11, 2014 – Willie Jennings, Festival of Faith and Writing.


Apologies for the dearth of posts recently… 
We hosted the Slow Church Conference last week, and are now at the Festival of Faith and Writing. (If you are here at the fest, please stop by our booth and say “Hello.”)


Interview about SLOW CHURCH that John Pattison and I did for Christianity Today’s PARSE blog.
DOWNLOAD a free sampler from the Slow Church book here.

Reviews, etc. posted this week on The Englewood Review of Books website:

  • Willie Jennings – Slow Church Conference [Audio]
    Here is the first of the audio recordings from the Slow Church Conference that we hosted last week here at Englewood Christian Church. Our aim for the conference was to foster conversation around the work of several key theologians whose work inspired the book that John Pattison and I wrote. [ Download a FREE sampler of [...]

Did you know we have a quarterly print magazine?

Digest powered by RSS Digest

by csmith at April 11, 2014 02:43 AM

512 Pixels

The National Brain Tumor Society's 2013 in Review →


More than 4,000 children are diagnosed with a brain tumor each year and today it's the leading cause of cancer-related deaths in children under the age of 10. To drive progress in pediatrics, National Brain Tumor Society has committed to direct one-third of its research budget toward this area, including Project Impact.


by Stephen Hackett at April 11, 2014 02:28 AM

Planet Lisp

Ben Hyde: Lisp on the BeagleBone Black

Somebody left an orphan in a basket on my doorstep: a BeagleBone Black. Inspired by Patrick Stein’s assurances that it was easy to get CCL working on another tiny ARM board, I gave it a try. It was tedious, but not hard. I’ve put my notes and some rough, half-tested scripts on github.

April 11, 2014 02:05 AM

CrossFit 204

Workout: April 11, 2014

Jenn will be heading back to Vancouver for the Canada West Regional. Congrats!


Back squat 5-5-3-3-3


Max muscle-ups in 2 minutes

Rest 3 minutes

Max handstand push-ups in 3 minutes

Rest 2 minutes

Max toes-to-bars in 4 minutes

Rest 1 minute

3 rounds of:

3 rope climbs from seated

10 burpees


Back squat 3 reps - Work to a heavy triple without missing.

3 rounds of:

10 front squats (225/165 lb., bar taken from the ground)

Row 500 meters

Rest as needed

Max muscle-ups in 2 minutes

Rest 3 minutes

Max handstand push-ups in 3 minutes

by Mike at April 11, 2014 01:50 AM

sacha chua :: living an awesome life

Emacs Chat: Tom Marble

Emacs Chat: Tom Marble – Invoicing with Org and LaTeX; Clojure

Guest: Tom Marble

Tom Marble’s doing this pretty nifty thing with Org Mode, time tracking, LaTeX, and invoice generation. Also, Clojure + Emacs, and other good things. Enjoy!

For the event page, you may click here.

Want just the audio? Get the MP3.

Check out Emacs Chat for more interviews like this. Got a story to tell about how you learned about or how you use Emacs? Get in touch!

The post Emacs Chat: Tom Marble appeared first on sacha chua :: living an awesome life.

by Sacha Chua at April 11, 2014 01:26 AM

The Tech Report - News

Thursday Night Shortbread

The Pick 6

  1. Reuters: 'Heartbleed' computer bug threat spreads to firewalls
    and beyond and Google to sell Glass to public next week
  2. DigiTimes: Intel to launch Haswell refresh
    CPUs and 9-series chipsets in early May
  3. VR-Zone: AMD launches 'Jaguar'-based AM1 desktop platform
  4. iFixit's Samsung Galaxy S5 teardown
  5. G4Games: Firefox OS 2.0 UI gets exposed, looks pretty
  6. Softpedia: Nvidia releases first Linux driver with overclock features


April 11, 2014 12:44 AM

Culture Digitally

Capture, Fixation and Conversation: How The Matrix Has You and Will Sell You, Part 3/3

Fixation: The fixing process is important to me theoretically because it’s a cross-cutting term. Fixing suggests that capture is not just a matter of direct representation but of representation in a particular way. So, insomuch as platforms, or networks of platforms, capture and fix, they do so with a certain plan. Inasmuch as they afford users the ability to do a number of things and capture a host of content, fixing processes frame that content in particular ways. Formats, character limits, filters, etc. are the most evident elements of fixing, but so are things such as the arrangement of content on a platform’s interface, the exclusion of content, and the platform’s triangulation of its fixing processes with laws, norms, and business goals. I like the work being done to interrogate fixation by our colleagues around the globe. The work interrogating normativity, neoliberal ideologies in digital architectures, big data, social practice, and the politics of algorithms is an empirically diverse and theoretically rich sortie into the messy world of fixation.

For me, some things remain to be explored in the fixation process. Estrangement from ourselves, for one, is not often discussed. We, sometimes rightly, celebrate the opportunities social media afford us to represent ourselves, the selves we want to be, or to connect, but often forget about the odd moments when we see that captured self, fixed in the platform, and, turning toward the screen, want to say: “That is not it at all, that is not what I meant, at all.” Maybe we’ve all had that feeling when, for example, Facebook implemented the timeline feature. It presented our history as a matter of data, a composite of tags (by us or of us), images, blurbs, etc. Captured into the platform, or surrendered there by us, our history, so mediated, was therefore not necessarily authored by us: a species of unauthorized biography. Of you history shall be written, but remember, “history is always written by the winners.” But this is the lesser of the issues. Fixation on the platform’s back-end is what troubles most. How are we configured by algorithms based on logics that ultimately have little to do with us as human beings? How are we data?

Despite the dynamism of social media (streams of text, constant updates, ever-changing videos and images), fixation ensures a certain suspension of action. It is related to what danah boyd calls “permanence,” but permanence may belie a certain kind of political economy and intent that wants to make capture a practice, not merely a consequence of a technical structure thrust upon us ex nihilo. “Networked publics,” yes, but also networked products and networked markets. Fixation, in biology, requires the use of chemical agents to create chemical bonds between amino acids across the proteins that compose biological systems; it stops denaturation (decay and entropy) in its tracks, but also life. Fixation is the literal pinning-down of dynamic organic life so that it may be examined. Fixation through algorithm renders the social, the person, or the group as object and process, harnessable for markets, for governments, for corporations. As T. S. Eliot once asked, “When I am pinned and wriggling on the wall, / Then how should I begin / To spit out all the butt-ends of my days and ways?”

There is a tension in fixation: fixed slices of time and life matter as reducible data sets, but on the other hand the system must be plastic enough to accommodate new data sets, like status updates on the status update on you. System plasticity (a term used in neurobiology to describe dynamic neuronal architecture) in this case represents a conversation between the dynamic nature of the human and the more bounded nature of data: a conversation between the evolving, the living, the dying, and what we are as data in the network, a fixed point. Ultimately, decisions made on the basis of bounded data endure beyond the instant and impact already-changed circumstances. A data isomorph that Lewis Mumford, I imagine, would find frightening. Herein we may find room for resistance: in the lag between the fixed and the living, where the cacophony of human noise, randomness, and messiness will hopefully confound the platform’s algorithm. And it will be left, at last, only with impressions.


Conversion: Fixation sets up the context for conversion, the transformation of all that is known, all that is fixed, into elements of exchange economies. It creates the political economy of participation, of creativity, and of the self as cultural product. For designers of capture platforms, conversion is a technological problem as much as it is a social question. My point here is that at the user level, social processes fixed in social media afford an understanding of “what is going on.” They translate for users a reality of place, time, outcomes, and possibilities. Capture platforms make concrete, and argue for, their version of meaning and value through design. For platform designers, the trick is conveying value in a way that is both socially perceived and procedurally acceptable to a user base. For users, the technical solutions provided by platforms take the shape of feedback systems: communication features that allow users to respond to one another’s participation. Not simply to say something back, mind you, but to perform a narrative by deploying a literacy in images, texts, and sound, creatively assembled on spaces laid out by architecture. Extant research has taught us a lot about the social dynamics afforded within these systems. Economies of popularity, of tribe, of identity are all there at play, like the flying parts of a mad Corliss engine. But feedback systems do more than ensure social process; they reify it.

To reify: to make real or concrete that which was abstract. In their reification function, feedback systems make visible and record the social. They literally allow users to see strife, reciprocity, attention economies, popularity, etc. Not only that, but the reification function creates the architecture by which value can be derived from those processes. It becomes the material proof of what may be the case outside the platform, thus adding to already-existing social capital, status, meaning, or popularity. Conversely, it may create a platform-based referent that validates perceptions of value external to the platform. To make this point more concrete, think of something as everyday as popularity and the person who has it, at school or at work. If he or she is able to import it into the platform, reifying it through a feedback system, then it’s possible that it may increase within the platform but also outside it. For someone without much popularity, who then is able to build it within the platform, creating what Alice Marwick and others might call “micro-celebrity,” it may well translate into external consequences. The platform in that case serves as a form of social proof. But it can also serve as an amplifier of already-existing conditions.

Conversion through feedback also creates an architecture for reciprocity, trust, performance, participation, alliances, etc. Conversion and its methods are technological solutions to the accounting of social dynamics and their performance. By creating ways in which users can see, read, hear, and feel the social, and by creating architectures by which those inputs can be responded to, they convert participation into value: situated value for the user and ad revenue for the platform owners.

What you see here is YouTube justice, an example of reification from my research that illustrates a social, community-driven and community-understood value. The user [12awinstinct] was accused of stealing content from another user [iFlyILLINI]. The community, through a review of evidence (videos and supporting commentary from other users), found him guilty and punished him with a loss of subscribers, a potentially costly tax on social and monetary capital if ad revenue was involved. Conversely, the community rewarded the offended user with an explosion of subscribers. Perhaps most importantly, the process was logged in real time by systems inside the platform but also by third-party aggregators, like one which tracks channel popularity on YouTube. The vested elements of the YouTube community saw justice done: YouTube justice.

By now it should come as no surprise that value “lives” on multiple levels in a platform. So in my example, the platform allowed YouTube users to derive situated value from the conversion of subscriber numbers into justice or retribution. On another level, platform owners need to do some conversion of their own, albeit primarily in terms of revenue. For social-network businesses bent on the “social,” or “the socially created,” as their central value driver, inventory (you, me, and our interactions and creations) must be parsed by mechanisms that allow us and ours to be sold. These mechanisms usually take the shape of systems that analyze and repackage. Typically these are back-end structures, hidden from user view. Ultimately, however, the market value of a social media web business hinges on the algorithms that analyze and repackage both content and creators in ways that can be traded. Facebook’s less-than-stellar IPO, for example, was not only the result of glitches in the NASDAQ’s trading system but also perhaps due to a general uncertainty about how exactly Facebook was going to convert all those users and their participation (e.g., their likes, links, comments, pictures, personal information, and who knows what else) into profit.


Generating value for a social media business is both a technical and a social problem. Technically, the “right” algorithms must be designed to execute the analysis and aggregation that will suit customers. As a social problem, markets must be created and sustained, labor forces kept working and incentivized, and people must see the situated value of their products through conversion so that they keep producing, so that the platform can do its other forms of conversion. Yes, as Tarleton Gillespie would say, algorithms have a socially contingent nature that impacts the way in which they “speak” about what is valuable. If thinking about the values embedded in algorithms allows us to see their politics, it may also benefit us to think of their economics not only as a value but as a logic for understanding what is the case.


What an algorithm says about a group, individual, or any bit of information on a platform is concerned with value (both in the long and the short term). The algorithm may suggest a “trend,” as Twitter does, but as it constructs a trend it also builds value. Trending, as a measurable process, is an exercise in some form of measurement (public opinion, popularity, the rate of increasing or decreasing interest), but it’s also a stab at creating value. It’s an attempt to convert all that “chatter” into something that can be sold: to investors, to other media businesses, and to users. The interesting thing is that the more one considers what an algorithm is telling us, the more one realizes that, for some platforms, “trending” of any sort (on Twitter or YouTube) is a sure bet.

This is not to suggest that outputs are necessarily biased toward what amounts to the most profitable analysis or aggregation, but rather that the process of analysis and aggregation, and its claims to truthfulness about whatever information was processed, is undertaken in the first place because it is perceived to be valuable. No one would have asked “Is it trending?” unless someone else had first asked, “Can we sell trending?” The value placed on the outcome of an algorithm is as much based on its “accuracy” as on 1) having users see it as accurate and 2) when users are inventory on that platform, having them handed over as seers of information of a particular stripe.
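The point that “trending” is both a measurement and a manufactured value can be made concrete with a toy sketch. The function below is entirely my own illustration (no platform’s actual trending algorithm is public, and none is this simple): it scores each term by comparing its mention count in a recent window against an earlier baseline window.

```python
from collections import Counter

def trending_scores(recent, baseline, smoothing=1.0):
    """Toy trending score: ratio of a term's frequency in a recent
    window to its frequency in an earlier baseline window.
    A score well above 1.0 marks the term as 'trending'."""
    recent_counts = Counter(recent)
    baseline_counts = Counter(baseline)
    scores = {}
    for term, count in recent_counts.items():
        # Additive smoothing keeps brand-new terms from dividing by zero.
        scores[term] = (count + smoothing) / (baseline_counts[term] + smoothing)
    return scores

# Mentions captured in two one-hour windows (hypothetical data).
baseline = ["cats", "cats", "news", "game"]
recent = ["game", "game", "game", "news"]
print(trending_scores(recent, baseline))
```

Note that everything upstream of the arithmetic (the window sizes, the smoothing constant, which mentions get captured at all) is a design choice, and those choices are precisely where the value-construction described above takes place.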

Because economic value is part of the equation, our social processes, our identities, and our membership in family, work, etc. become functional artifacts, means to ends for the platform and its owners. As such, we become mass culture, and the platform the production apparatus. Our profiles are made by us, but with tools not of our making. Our Google search results may be crowd-made, but made in the image reflected in the platform and its triangulations of what counts as valuable, useful, and interesting. The skepticism of science and technology studies, which questions, at its core, any socially situated technical or scientific system that claims to know the case, should be brought to bear on such artifice. Power lurks in epistemology. I have to say that algorithm and platform are more than political or, as they like to tell us, stewards of public discourse. They are epistemology.

To the extent that social media capture (recruit, fix, and convert), they do so in a manner that reproduces and reinforces a type of cultural production, a particular type of self (networked, as Julie Cohen would say), configured by fixation for conversion. The broader point is that once in place, that system continues to reinforce those arrangements, begging them of its users and leaking those practices into life worlds outside the realm of media. At such points, social processes removed from social media are confronted with arrangements born in the ether. We assume connectivity and feel displaced without it. We are habitually never alone; we are the “searching for network” function, and we have become accustomed to being reified and captured. Say what you may, the structure of how we are social has changed in many respects since we started liking, status-updating, taking selfies, and tweeting. We may have made some facets of social media our own, through adaptation and appropriation of features and alternative uses, but capture/conversion systems care little about that. It’s all part of the bet that, in the processes of digital cultural production, through capture, everyone and anyone can be pinned and sold to somebody.


Acknowledgements: This series of posts is based on a lecture given at the Colloquium for MIT’s Comparative Media Studies Program in the fall of 2012. My thanks to MIT’s CMS for inviting me and for asking me to think on this topic. Of course, thank you T. S. Eliot and the Wachowskis :-) I would also like to thank all of the contributors and participants in Culture Digitally. Friends, all. Without their writing, conversations, and joking around over a beer or a nice cup of coffee at a conference or workshop, my thinking on these matters would not be possible. If you are not listed on the blog but we have spoken and shared a laugh and an idea, you are in here too.

by Hector Postigo at April 11, 2014 12:27 AM


Do You Pronate?: A Shoe Fitting Tale

Last week I stopped by my local Dick’s Sporting Goods store one evening to pick up a pair of shoes that had just been released (yes, I know, that was my first mistake). It seemed that there was only one person working the shoe department, so I was sitting on a bench waiting for her to finish helping another customer who had arrived just before me. Listening to their conversation made me cringe.

When asked if she needed help, the woman told the clerk that she was looking for a pair of sneakers similar to the ones she was currently wearing. They were some model of Saucony, I think she called them Sonomas or something like that, so she clearly wasn’t a shoe geek. She just wanted a pair of comfortable shoes that weren’t too brightly colored.

The clerk’s next question for her was “Do you pronate?” It was all I could do to not intervene. The clerk told her that “If you pronate, you should get one of the shoes labeled stability.” The customer clearly had no clue what the clerk was asking her, and she was looking at her like she had three heads. The clerk attempted to clarify by saying something along the lines of “Do your ankles roll in?” Again confusion. I bit my tongue. No further attempt to explain, no attempt to assess. The clerk walked to the shoe wall and picked a few suggestions.

I always struggle in situations like this because I think a lot about this topic, and it was a perfect moment to help someone out and maybe educate a bit in the process. But at the same time, I’m not a confrontational sort (at all) and you never know how someone will react when you call them out [I once tried talking to the owner of a local shoe store - not a run specialty store - about his fitting process (they have a fancy arch scanner machine). I don’t think he appreciated my thoughts.]. It was pretty clear that the woman was not going to be running in the shoes based on her comments when trying them on, so I opted to keep my mouth shut.

This entire exchange reinforced for me why I hate the whole pronation model of fitting shoes. First, the question “Do you pronate?” revealed that the clerk didn’t really understand what pronation is, and was probably just repeating something she had been told to ask by a manager, brand, or store fitting procedure. The reality is that everybody pronates, and pronation is a completely normal movement. (I should note that the term pronation as used colloquially is typically equivalent to rearfoot eversion; the actual movement is a bit more complex.) We might vary in how much we pronate, but asking someone if they pronate is like asking them if they breathe. I’d actually be much more concerned if the customer had revealed that no, she doesn’t pronate. At all. That would be worrisome.

The second thing that bugged me is that the customer had no clue what this clerk was talking about. So we have someone who doesn’t understand what pronation is asking a question about pronation of someone who seems to have never heard the word used before.

Third, asking someone if they pronate and expecting an accurate response is kind of silly. If you’re going to go so far as to use the level of pronation to help fit a shoe, shouldn’t you at least try to assess it somehow? Did she expect the customer to say, “Why yes, I have 10 degrees of eversion”?

And finally, even if there was an accurate assessment of how much this customer pronated, I still have yet to see strong evidence saying A) how much pronation is too much for a given individual, B) that any given shoe is effective at controlling pronation when you look at the actual movement of the foot inside the shoe (and there are no data I’m aware of showing the relative pronation-controlling effectiveness of the various shoes on the market), or C) that fitting a shoe based on amount of pronation is warranted or effective from an injury prevention standpoint. I like this statement on the topic in the newly released guidelines on selecting running shoes from the American College of Sports Medicine (an interesting document that I should write about – I have mixed feelings about it on the whole):

“Be aware that all runners pronate, or drop the foot inward. Pronation is a normal foot motion during walking and running. Pronation alone should not be a reason to select a running shoe. Runners may be told while shopping that because pronation is occurring, a shoe with arch support is best. In fact, the opposite may be true. Pronation should occur and is a natural shock absorber. Stopping pronation with materials in the shoes may actually cause foot or knee problems to develop.”

For another example, Dr. Benno Nigg of the University of Calgary, one of the most widely respected experts on running shoes and biomechanics, had this to say in his 2010 book Biomechanics of Sports Shoes:

“…the perceived dangers of overpronating and the expectation of resulting injuries resulted in technologies (e.g., dual density midsoles and orthotics) being developed to decrease both the maximum pronation as well as the time to maximum pronation. These products were (wrongly) assumed to be methods for the treatment and prevention of pathologies such as plantar fasciitis, tibial stress fractures, and patella-femoral pain syndrome. Evidence for the effectiveness of such strategies is currently unavailable. It is speculated that there is no such evidence because “overpronation,” as it occurs in typical runners, is not a critical predisposition for injuries.”

“Pronation and supination have long been the “danger variables” hanging over the sport shoe community, but their time as the most important aspects of sport shoe construction is over. Pronation is a natural movement of the foot and “excessive pronation” is a very rare phenomenon. Shoe developers, shoe stores, and medical centres should not be too concerned about “pronation” and “overpronation.””

(For more along these lines, read this great article on pronation by Ian Griffiths.)

This pronation business frustrates me on a personal level because I was “diagnosed” as an overpronator when I first started running (and at the time I didn’t have the knowledge to know better). My diagnostic procedure involved me being eyeballed while trotting about 10 feet across the floor of a store. I’ve since come to realize that I do pronate a fair amount, but I seem to do better when I let the movement happen.

I’m also frustrated because I often see clients in the clinic who tell me they were sold a specific shoe because they are “overpronators.” Often these clients have very average pronation when considered in the context of the range of pronation that I have observed. Many of them are scared to try something else because they were told that they are an “overpronator” and thus need added support/stability. That’s what bothers me most. Once you get pigeonholed as a dreaded “overpronator,” fear of injury dictates your future shoe options (Don’t you dare try a neutral shoe!). It prevents you from trying something different that might be  a better match for your stride.

Coming back to the store clerk, I don’t blame her in this situation. She was probably just doing what she was told to do by management. I blame store management, or whoever helped them develop their fitting protocols. I blame brands that continue to promote this method of fitting shoes on their websites. I don’t think pronation should be the first thing asked about when fitting shoes. And if you are going to ask about it, at least explain to your employees what it is, when it might be relevant to an injury history, and train them how to assess it in a somewhat meaningful way (e.g., while someone is actually running, recognizing that there are limitations even here when you can’t see what the foot is doing inside the shoe). If you can’t do these things, don’t even bother bringing it into the fitting process. 

Are there times when attempting to control pronation with a shoe is warranted? Sure, absolutely. But assigning a shoe based on asking someone “Do you pronate?” is just plain silly.

For more, you can read a free chapter on pronation from my book, Tread Lightly. Access the chapter here.

by Peter Larson at April 11, 2014 12:02 AM