Source: The Independent, LSE
The Independent has published a 'pro left wing' article, reporting that "the centre ground of British politics could be further to the left than generally thought, according to new research."
The article goes on to say, "A study into the responses of a cross-section of voters who declared a strong allegiance to the Conservatives, Labour, Greens, Liberal Democrats, SNP and UKIP, suggested that many are more left-wing than they believe themselves to be."
What’s your view of this?
Is this just a way of trying to promote liberal politics? Do you believe the UK is now becoming more left wing? Where do you stand politically, and why?
Comment below with your views.