I was directed to an article from a Texas news station (thanks Nathan!):
According to the story, the Bible is, by law, being taught in public schools in Texas. The reaction I saw to the article was one of sharp disgust and upset. I jumped into the conversation with both feet, but then a question occurred to me that demanded to be asked:
What does that mean to you? If you heard that public schools in your area are "teaching the Bible", what would you assume that means? If I were to attend that school, what should I expect?
Defining the terms is key to a productive debate... What's your definition?