My research into ADHD's differences in cognitive styles led me to learn a lot about intuition and instinct, and not just in people with ADHD. What I took away is that we, as humans, lack humility about what we know versus what we don't. It's essentially the Dunning-Kruger effect, but applied to EVERYTHING. Whether it's vaccines, Brexit, wars, etc., most people will form strong opinions regardless of how little they know. People become armchair experts in infectious diseases, immunology, macroeconomics, statistics, pretty much every industry and scientific field.

Much of it comes down to Kahneman's 'Thinking, Fast and Slow': people read or hear a headline and form an instant opinion from just a fraction of the information (one that fits their other preferred narratives), and that opinion becomes an anchoring bias. Next, they filter further sources and information through confirmation bias. They also tie their identities to these opinions, so it becomes about ego: people generally think they're a lot smarter than they are and don't like to concede. They will block or ignore inconvenient facts, or the people delivering them, because they don't want to be 'wrong'; being wrong would mean admitting to themselves, and showing others, that they're not as smart as they thought. As I said at the start, it's a lack of humility. That does change for some (emphasis on 'some'): with age we figure out how little we actually know, we start avoiding strong opinions, and we find some humility.
A flawed sense of reason often develops through constant exposure to distorted or misleading information, particularly when it comes from trusted sources. It may start in childhood, when critical thinking isn't nurtured, or when a child is trained to accept authority without challenging it. Over time, cognitive biases such as confirmation bias reinforce those patterns and predispose people to avoid considering other viewpoints objectively. Extreme ideology can alter a person's moral orientation by fostering an in-group vs. out-group mentality. Once someone bases their identity and sense of social belonging on those beliefs, bias follows: new information gets judged through that filter, regardless of how little sense it makes from the outside. It can change, but usually only gradually, and through an approach built on trust: seeking common ground, asking unscripted questions, and offering modest examples the person can accept that puncture their assumptions. It takes time, and civil conversation seems to open more doors than direct argument.