Isabella wrote:
I can't say for certain, Maddog, but I am not certain that it is Christianity which has changed. Wasn't Christianity considered a religion of women and slaves back in Roman times? Then Rome became Christian and all the Roman patriarchal structures were integrated into the early Church. Now that those connections are long gone and all that is left is the religion, has it gone back into the hands of the women?
First, Rome did not become Christian. Rome adopted a veneer of Christianity but kept its pagan ideas: monogamy as the only allowable form of marriage, holidays such as Saturnalia (which became Christmas) and the festival of Ishtar (which became Easter), and asceticism, a practice which had started creeping into and corrupting Christianity even before Constantine.
Second, Rome was not really patriarchal, at least not in the way the Bible teaches patriarchy. Rome and Greece were the only two major ancient cultures to practice socially-imposed monogamy as the only allowable form of marriage. That un-Biblical marriage custom, which, like asceticism, grew out of the Greek worship of the male form (think "Olympic games"), was adopted by the Roman church and labeled "Christian" without any Scriptural justification whatsoever - in fact, in direct opposition to what the Bible teaches, as has been shown in hundreds of posts by real Bible scholars on this forum.
Roman culture was extremely repressive, harsh, and barbaric. Who was it who "perfected" the art of torturing people to death? Medical doctors who have studied crucifixion say that it is the most painful way to kill a person that has ever been devised. Human life, especially the lives of slaves, women, and children, was not considered to be worth anything.
But true Christianity values all human life, not just the lives of the elite and powerful. So, naturally, women and slaves would be drawn to such teaching more than those who held power, such as the Jewish religious leaders and the Roman aristocrats of Jesus' day. In fact, the Gospels tell us in several places that many women followed Jesus, and that it was the women, not the male disciples (other than John), who were present at the Crucifixion. Most of Jesus' male followers were hiding in fear after His arrest.
IMHO, and I am speaking from bitter experience here, those who apparently become born-again Christians but later turn their backs on Christ have an improper understanding of what it means to be a Christian. That is initially the fault of the church, which taught them false doctrine such as the doctrine I grew up under. (Was I truly born again? I believe so, but I did not know what it really meant to be a Christian until 30 years later, after spending nearly 20 of those years trying to run from God.)
Of course, the Bible says that we are without excuse, and so those of us who were taught false doctrine absolutely must study the Bible for ourselves and discover God's Truth, as I did after becoming an adult (and after I quit trying to run from God). And those who were fortunate enough to have been taught true Christianity by whatever church they were part of must also study God's Word to confirm that they really were taught Truth.
So, Isabella, you are correct to say that Christianity has not changed.
What has changed is that which is falsely called Christianity. God has always had at least a remnant of His people who know, believe, and live by His Truth regardless of the cost, and regardless of what false Christians might teach as being "christian."
Being a true Christian is not for the faint of heart. It will cost you something. Many of us on this forum have paid a price because we believe what the Bible teaches about marriage and will not compromise what we know to be Biblical Truth.