I am interested to know how my fellow Europeans feel about the demise of Christianity within European society and the effects it has had. Growing up in the 1980s, my school was Church of England: we were taught from the Bible, sang hymns, and worshipped God in assembly. Having young nieces and nephews, it is quite clear to me that all of this has been removed from schools. I personally feel it has had a detrimental effect, and that there is a clear link between Christianity being driven out and questionable morals increasing in society. Am I the only one who thinks this?