In episode #4 of The Futurists, Regina Joseph talks us through the mechanics of the methodology known as superforecasting. From her work with Pytho, Sibylink, the National Science Foundation, the Intelligence Advanced Research Projects Activity (IARPA) and beyond, we learn how forecasting can reach well into the future through human collective intelligence techniques and statistical analysis. What does it take to see the trends of the future emerging when the world seems chaotic, disruptive and unpredictable?
[Music]

This week on The Futurists: "You know, we're entering into a world where we have to hybridize our decision making with machine intelligence. And so when we think about human intelligence, whether that's at the individual level or at the collective human intelligence level, where are the advantages that we have on the human side that can actually outperform machine intelligence? Because I think that is a monolithic idea that people generally have: that machine intelligence will always outperform human intelligence, that it is necessarily better." "Well, it is good to identify where we add value in the future system, right?" "Exactly. But you can't do that unless you measure it. You simply cannot do that unless you establish certain benchmarks."

[Music]

Hello and welcome to The Futurists, where we're interested in talking to the people who anticipate, influence, and invent the future. I'm Rob Tercek. And I'm Brett King. Welcome to our podcast.

Who is Regina Joseph, Super Forecaster?

Today our guest is a polymath: someone who's skilled in a number of different fields, but fields that are really relevant to this topic of forecasting and anticipating the future. Our guest today is Regina Joseph. She is a super forecaster, and we'll get into exactly what that means in just a minute. She's also a cognitive science researcher, and she's launched a number of interesting projects. So in this show we'll talk a little bit about how she got into super forecasting, and then later we'll get into some of the new things that she's working on, because they're super relevant to this idea of futurists. Welcome to the show, Regina.

Thank you so much for having me. It's great to see you guys, and it's a great pleasure to be here.

Okay, so let's start off with the question everybody's probably wondering about: what the heck is a super forecaster?

Yeah, do you have to wear a cape?

I know, we get that all the time. And I do have a cape somewhere, but probably not the one that most people have in their heads.

Where did the term Super Forecasting come from?

So the term "super forecaster" actually comes from a research program that began in 2011 and ran for four years, until 2015. It was funded by the intelligence community in the United States, by IARPA, which stands for the Intelligence Advanced Research Projects Activity. IARPA is basically the research and development arm of the intelligence community, the Office of the Director of National Intelligence in the United States, and it runs high-risk, high-payoff experiments that are basically designed to avoid surprise. Many people are familiar with DARPA; DARPA is the analogous division for the Department of Defense, and IARPA is its counterpart for the intelligence community. DARPA was really behind the development of what we now know as the internet. So the kinds of things that trickle down into most people's lives are often developed 10, sometimes 15 or 20 years earlier through these kinds of R&D agencies. They put a lot of money into developing experimental research. They are the future, right? They're in the business of funding it.

What problem was IARPA trying to solve when they set up the forecasting program?

So at the time of IARPA's creation, around 2006, there was a lot of concern that the types of global conflicts the United States was experiencing, the types of global outcomes that people around the world were witnessing, didn't seem to be affected by the intelligence and information that was reaching experts like the people working in the intelligence community. So why weren't we able to detect 9/11? Why weren't we able to detect the fall of the Berlin Wall? Why weren't we able to do something about, even recently, the evacuation of Afghanistan?

Why wasn't the intelligence community able to foresee 9/11?

There's always a question of whether or not the experts who are deployed to make predictions or forecasts about outcomes in the future are the best people that you have.

So IARPA invited several groups of teams to start making forecasts and predictions, but one team did particularly well, as I recall.

That's right. And a lot of the inspiration for that research came from a book written by Philip Tetlock, who's a professor at the University of Pennsylvania.

Who is Prof. Philip Tetlock, author of Expert Political Judgment?

He wrote a book called Expert Political Judgment, and basically what that book was was a serial examination, over many years, of millions of forecasts made by pundits and experts, people who have the public legitimacy and authority to claim expertise over a particular area of interest. What he did in that book was to examine whether there is a correlation between expertise and accuracy in prediction, especially around geopolitics and economics. That book came out in 2006, and IARPA was interested in testing it out. So they build their experimental programs, like DARPA does, around competitions where people are invited to submit proposals in response to a question postulated by IARPA or DARPA. Teams are invited to make these proposals, and then there's a process of selection where maybe three or four teams get to compete against each other.

What is the Good Judgment Project?

Phil Tetlock had his own team called the Good Judgment Project, and that was not only one of the original four teams that was part of the ACE program when it began in 2011; it also became the winning team, and I was a member.

He proved his thesis.

Yes, and then some. Before this, I think there was a certain amount of skepticism, both in the public sector and the private sector.

Why can experts be bad at predicting the future?

Certainly some experts are very good at prediction, but it turns out that people are often surprised by the lack of direct correlation, in certain cases, between that type of expert credentialing and the accuracy of the forecasts.

In fact, I think Tetlock's team wasn't even comprised of so-called experts. It was comprised of people who had better thinking habits, people he had attracted and then sifted through to find those with the best thinking skills. And you know quite a lot about that. Can you tell us about the attributes of somebody who's considered a super forecaster? What makes them a super forecaster?

What makes a good futurist or Super Forecaster?

Sure. Well, what Phil and his wife Barbara Mellers, one of the co-principal investigators of that experiment, were looking at was identifying some of the psychometrics, the traits related to the propensity for being good at forecasting. And I think a lot of the focus was on the idea that the people in the Good Judgment Project, Phil and Barb's team, were not experts. That's not entirely true: some of us actually do this for a living. I was already working at a think tank doing exactly this, building out a futures division for a think tank to make geopolitical predictions.

Right, an expert in futurism, but you weren't necessarily an expert in defense or intelligence or foreign policy.

It was one of the largest foreign policy think tanks in Europe, so that was exactly what I was doing, and I already had credentials around that. And there were also a few other super forecasters who similarly had that background. But in general, many of the people who were involved in the experiment and who became identified as super forecasters were people who were not working in the world of making predictions about geopolitical and economic situations. What was interesting was the commonality in the traits that we share. Most of us are kind of numbers nerds. We did very well on the types of psychometric assessments like number series and Berlin Numeracy: tests that measure how good you are at detecting patterns in numbers, and how good you are at assigning numeric probabilities and making numeric estimations about things rather than using words. That also speaks to a long-standing concern within the intelligence community about the quality of a prediction. It's one thing to say whether you think an outcome X is likely or unlikely, but there's a lot of wiggle room between what "likely" means to Brett versus what "likely" means to Robert. So some of the side experimentation that occurred during the ACE program period, done by Phil and Barb and others on the team, looked at this idea of quantifying what those words mean. And when you do that, you realize there's so much ambiguity in what those words mean that you're actually leaving something like a 20 percent difference in accuracy on the table if you're using words rather than numbers. So that was a big part of understanding how we communicate this, and what kind of process we use to enable better predictive assessment.
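To make that wiggle room concrete, here is a minimal illustrative sketch in Python (ours, not something from the ACE research): the word-to-number ranges below are invented examples of how differently two readers might translate the same verbal term, while a numeric forecast pins the forecaster to a single, scoreable probability.

```python
# Illustrative only: hypothetical ranges for how differently readers
# may translate the same verbal likelihood terms into probabilities.

verbal_readings = {
    "very unlikely": (0.02, 0.15),
    "unlikely":      (0.10, 0.35),
    "likely":        (0.55, 0.85),
    "very likely":   (0.80, 0.95),
}

for term, (low, high) in verbal_readings.items():
    spread = high - low
    print(f'"{term}": read anywhere from {low:.0%} to {high:.0%} '
          f"(a {spread:.0%} band of ambiguity)")

# A numeric forecast ("I give it 72%") has zero such spread, which is
# what makes it possible to score and compare forecasters at all.
```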
So Regina, you've mentioned psychometrics, and obviously you're talking about the statistical side of things. But often, when we're trying to predict the future, or look at how our systems are going to respond to it, we're either looking at individual human behavior or collective human behavior.

Understanding human behaviour when making predictions

So how much of the art of this is understanding the behavior of humans, and the historical precedents that can impose themselves on a forecasting model?

That's a really great question, because I think that leads to a lot of the current work that we are dealing with now, years after the end of the ACE program.

Hybrid decision making with human and artificial intelligence

We're entering into a world where we have to hybridize our decision making with machine intelligence. And so when we think about human intelligence, whether that's at the individual level or at the collective human intelligence level, where are the advantages that we have on the human side that can actually outperform machine intelligence? Because I think that is a monolithic idea that people generally have: that machine intelligence will always outperform human intelligence, that it is necessarily better.

Well, it is good to identify where we add value in the future system, right?

Exactly. But you can't do that unless you measure it. You simply cannot do that unless you establish certain benchmarks. And so, going back to your point, one of the things that I've been involved in, and that I'm doing in my current research, is looking at an idea identified by Daniel Kahneman, who won the Nobel Prize in decision theory, and whose work is one of the cornerstone, foundational sets of ideas that powered the ACE program, the IARPA research program, and a lot of the anticipatory intelligence concepts that have come from it. When you are trying to make a decision about something, most people are going to make a gut decision. Most people are not going to take a step back and think: yes, but what are the statistical realities about whether something is going to happen or not?

Right, most people default to the personal: they make a gut decision and then they look for evidence to support their decision. So they're going with confirmation bias, right?

Correct. But what Daniel Kahneman was looking at was this idea known as outside thinking at the statistical level.

The importance of historical base rates for forecasting

We would call that the base rate: the historical rate of occurrence of an event, which you can use to build a more accurate prediction about things that have yet to happen. Being able to identify a base rate is one of the keystones of what I work on in my research, and it's what we've seen in our work, especially in the work that I do in establishing a systemic process. The base rate is everything: how you find it, how you present it to the end user, how you can adjust away from it. All of those are absolutely critical factors in sharpening the accuracy of a forecast or a prediction.

So give us a for-instance, because this is pretty high-level. What's an example of a base rate? Let's say we wanted to forecast a scenario about tech adoption, maybe VR and AR and XR, these new technologies that are coming. We could formulate all kinds of predictions, but they would be based on anecdotal stuff: reports in the press, what we've gathered, maybe some numbers about headsets that have been sold. As you're saying, that's sort of a trap, because we're likely to make an inaccurate forecast based on just this randomly assembled data. You're proposing a base rate. So what would the base rate be for something like that? How would we go about constructing a better model for making a forecast?

Well, VR is a great example. I've been doing work in VR since as early as the early '90s, so this is something that has been around and in development for many decades. And yet, right before CES, the Consumer Electronics Show, held every year and one of the biggest tech shows in the world, there are always predictive roundups. Every newspaper has them. They ask a bunch of experts and pundits: what do you think is going to be the hot thing at CES?

And it's a bunch of people spitballing ideas about whatever they think is hot. Let's get real: those end-of-the-year forecasts about technology are all BS.

And so, for a period of about 10 years, year after year, from around 2013 up until even a few years ago, I would get a call from newspapers like the Guardian or whatever, saying: what do you think? Everybody says VR is going to be really hot. And I said: this is exactly what everybody's been saying every single year. And if it wasn't hot then, and you still haven't sold through a sort of minimum viable level of uptake in American households at a certain price point, you're always going to be in this setup where, yeah, VR is going to be the next big thing.

Okay, but hang on a sec, I'm sorry to bust in here, but hang on a sec. If someone were to say, well, okay, VR is one of those industries where the sun never really rises, so my prediction for next year is that VR is also going to be disappointing, they would have been right for the last five or six years, because there's been a lot of hype since 2017. Every year it's like, oh, this is the year of VR, and it doesn't actually happen. But they'd be wrong eventually, right? It might be this year that they'd be wrong, because Facebook's going to sell 10 million units of the Oculus Quest 2, which is a pretty good headset, really the first decent headset. And 10 million is a pretty important benchmark. That's Sony PlayStation level: when the first version of the PlayStation came out, that was a big turning point in the game console business. So, narrowly defined, you could say things are starting to trend differently now. Okay, let's get back to the base rate. What's our base rate in that scenario, where we're trying to guess exactly next year?

Why temporal scope is critical to forecasting

So you're going to be looking for a couple of different things. First of all, there's the temporal scope problem. This is one of the biggest issues people have in getting good at forecasting: being able to actually determine the time period in which a future event will occur. Typically, when pundits make forecasts they'll say, well, there's going to be a recession. Yeah, eventually. But if you make that forecast in year X, and by year X nothing has happened, you weren't very sensitive to the temporal scope of your prediction. If it happens 15 years later, 20 years later, then it doesn't really matter. So temporal scope is a big problem. Part of the secret to extracting better accuracy has a lot to do not just with the forecast, but with the questions that you ask. That is one of the areas we specialize in too: this idea of question generation. Asking the right questions is a form of meta-forecasting. You have to be able to formulate a question in a way that it will still matter a year from now, two years from now, three years from now. It should allow the forecast to remain relevant. So if somebody says, yeah, VR is going to be a big hit this Christmas... okay. I mean, Robert, you just posited something which could be molded into a testable question: by Christmas of 2021, will Facebook sell 10 million VR headsets? That's a great question to pose. Now the answer becomes: what's your forecast?
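As a rough sketch of what a well-posed, temporally scoped question needs (an unambiguous event definition, a hard resolution date, and a source that settles it), here is a minimal illustration. The field names are our own, not a schema from Pytho or IARPA.

```python
# A minimal structure for a well-posed forecasting question.
# Field names are illustrative, not an official schema.
from dataclasses import dataclass
from datetime import date

@dataclass
class ForecastQuestion:
    text: str               # unambiguous event definition
    resolution_date: date   # fixes the temporal scope
    resolution_source: str  # who or what settles the answer

q = ForecastQuestion(
    text="Will Facebook sell 10 million Oculus Quest 2 headsets by 2021-12-25?",
    resolution_date=date(2021, 12, 25),
    resolution_source="manufacturer sales figures or reputable industry trackers",
)
# "VR will be big" fails on all three counts: no event, no deadline, no source.
```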
And this is where you need to start to integrate the idea of the base rate. Okay, let me think about that for a moment. How many VR headsets were sold last year? How many VR headsets were sold the year before that? Let me look at it. So at Pytho, my research partner Pavel Atanasov and I talk about what we call the base rate rule of ten: take ten incremental units of prior history, whereby you can establish some kind of a pattern.

The answer for 2020 is 5.5 million units.

Right, so there you go. So if we start to assemble that: if Facebook is projecting 10 million, and we're looking at numbers that look quite different from that projection, then your job as a forecaster is to start making granular adjustments between the previous rates of occurrence, an estimate that somebody makes about a future outcome, and all the other factual bits of information, all the variables and parameters that you need to take into account to refine that granular adjustment. Let's say you get a base rate: we have X number of VR headsets sold at this point in time, as of the time of this question. Okay, that's a good data point to have. But then, if you're going to try to refine that judgment, what you want to think about is: what are the price points of these headsets? What is the ratio of the price point of the headsets to the actual sell-through of the headsets, and how often has that happened every single time a new headset gets released into the marketplace?

So game consoles and smartphones could be a proxy there, for seeing what the price point is.

I would be looking at that data as well. I would be looking at comparative rates of uptake in certain types of devices and gear that have some kind of similar function: entertainment, knowledge development. Look at the rates of adoption for those products and look for similarities. Are they made by the same manufacturers? What is the sell-through in different regions of the world?

Forecasting starts with systemic inductive reasoning

So you have to apply a very specific process of systemic reasoning. You want to start by developing some inductive reasoning about the problem. You're trying to think through: what are all of the elements that I need to put into the mix in order to understand this problem and generate a reasonable response?
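Here is a hedged sketch of that anchor-and-adjust process. The only figure taken from the conversation is the 5.5 million units for 2020; the other sales numbers and the 0.3 weighting are invented, and the simple weighted average is our own simplification, not Pytho's actual method.

```python
# Hedged sketch of base-rate anchoring. All numbers are hypothetical
# except the 5.5M units for 2020 quoted in the conversation.

# Rule of ten: look at (up to) ten prior increments of history.
annual_vr_unit_sales_m = [0.7, 0.9, 1.4, 2.0, 2.7, 3.1, 3.6, 4.2, 4.8, 5.5]

# Base rate: the historical growth pattern, not a single data point.
growth_rates = [b / a for a, b in zip(annual_vr_unit_sales_m, annual_vr_unit_sales_m[1:])]
avg_growth = sum(growth_rates) / len(growth_rates)
base_rate_projection = annual_vr_unit_sales_m[-1] * avg_growth

claim = 10.0  # the manufacturer's projected 10M units

# Granular adjustment: weight the outside view (base rate) against the
# inside view (the claim), shaded by whatever variables you researched
# (price points, proxy adoption curves, regional sell-through, ...).
weight_on_claim = 0.3  # a judgment call informed by that research
forecast = (1 - weight_on_claim) * base_rate_projection + weight_on_claim * claim

print(f"Base-rate projection: {base_rate_projection:.1f}M units")
print(f"Adjusted forecast:    {forecast:.1f}M units vs. claimed {claim:.0f}M")
```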
So, Regina, in terms of that base rate: how much of that involves you actually learning about the domain to successfully come up with that base rate, versus using analogies from other forecasts that you've made?

Again, a great question, because that goes right to the heart of the idea of expertise, and we are dealing with that problem very directly in the current research that we do, called Human Forest. So I'm going to answer your question with an example. Right now our current research, Human Forest, focuses on the area of clinical trial transitions for new drugs, the process a drug goes through.

A really good thing to be investigating right now.

Exactly. And we did it for very temporal reasons: we're in the middle of a pandemic, and there's a lot of change happening in the traditional rollout of how a drug goes from development to approval and to marketing in the public domain. So what happens in that space? Lots of people assume that if you're setting up an environment where you're asking people to make predictions on whether drug X will transition from phase two to phase three testing, or whether drug Y will receive clinical approval to market the drug by X date (these are all great things you can forecast on), most people, if you ask them, would say: oh, the people who would perform best in that type of contest would likely be people who work in the life sciences, like clinicians, doctors, pharmaceutical executives, researchers. That would be a very logical and rational expectation on the part of most people. The reality is that that's not what we see in the research. And that has a lot to do with how you present the information, how you format that information, and what kinds of additive information you offer to those people. What we've seen is that our non-experts, laypeople with zero professional background in the life sciences, biology, medicine, or medical research, actually outperformed experts. And this goes back to... Phil has done a lot of the foundational work in establishing that there are a lot of reasons why experts don't always get it right, and much of that has to do with certain psychological issues. If you are in a small field of experts, consensus matters, because that involves reputational risk. If you go out on a limb and get it wrong...

Regina, let's hold off on that for a second, because I want to make sure we delve deep into the blind spots and the cognitive biases after we go to break. This is super interesting stuff. Let me take us to break. For those of you that just joined us: we're talking to Regina Joseph, the cognitive scientist and super forecaster from the School of International Futures. We'll be right back after this break.

Welcome to Breaking Banks, the number one global fintech radio show and podcast. I'm Brett King. And I'm Jason Henrichs. Every week since 2013 we've explored the personalities, startups, innovators, and industry players driving disruption in financial services, from incumbents to unicorns, and from cutting-edge technology to the people using it to help create a more innovative, inclusive, and healthy financial future. I'm JP Nicols, and this is Breaking Banks.

Hi, and welcome back. You're listening to The Futurists with Brett King and me, Rob Tercek, and today our guest is Regina Joseph. She's a cognitive science researcher with an expertise in forecasting, a super relevant topic for our show.

Expertise can lead to blind spots when forecasting

In the previous half we were talking about methodologies, some of the background, and some of the reasoning: how a group of forecasters led by a psychologist named Philip Tetlock started to notice something really important, which is that expertise doesn't always mean that you're going to make accurate forecasts. Sometimes expertise brings with it a bunch of institutional blind spots, and when you start to measure the results, really track the results of the forecasts over time, what we can see is that sometimes experts are no better or more accurate than flipping a coin. In fact, the 50/50 forecast track record is pretty good for most experts. One of the people who established that early on is Daniel Kahneman, the Nobel Prize-winning psychologist and researcher into cognitive bias, and of course, famously, the author of Thinking, Fast and Slow.

Why do experts get things so wrong?

Now Regina, in this part of the show what I want to talk about, and I think Brett and I are both keenly interested in this, is how people get it wrong. How do these people who are so smart, so steeped in the information, and so knowledgeable about the subject matter have blind spots? What are the cognitive biases that get in their way?

I've seen it so often in the banking space, for example.
I was just in Switzerland last week with the Swiss Bankers Association, and I'm meeting bankers who say: no, no, no, people are always going to prefer to do banking with a human. And I'm like, that's not even true now, let alone in the future. But yeah, how do we get around those blind spots, Regina?

Well, I think overconfidence is one of the most common problems that people who wish to make forecasts need to overcome. And certainly Kahneman identified this: everybody thinks they're good at forecasting. The natural tendency we have is to think that our decision making is pretty good, that our process is pretty good. The reality is that when you put that to the test, most people are pretty lousy.

It's like people who trade stocks. They always tell you about the stocks they picked that went up, but they never talk about the stocks where they blew it, where they got it completely wrong. And over time, in their own minds, because they're telling that story over and over again, what they start to believe is that all of their picks are good, that they're quite good at this. It's really astonishing to me that we blind ourselves by repeating this story over and over again. One of the things that Daniel Kahneman did so well was reveal the heuristics (I'm not sure if I'm saying that exactly right), this idea that there are mental shortcuts we take, because it's quite difficult to actually think about your own thought process, and so we always take a shortcut. Some of those heuristics include things like the availability heuristic, where you use the most recent example as your baseline, a fake baseline if you will, for predicting the next thing that's going to occur, even though there's not enough historical continuity there for that to be valid. Another one we talked about a moment ago is confirmation bias, where we make a decision about what's going to happen and then search for information that supports our opinion rather than disproves it. It's like the opposite of the scientific method. Regina, can you tell us a little bit more about cognitive bias?

Well, I think the simplest way to talk about it is that we all have them, and they are very, very difficult to mitigate, even when you are aware of them. It does require really understanding how to identify them, and most people really don't know how. If you ask somebody to identify a general bias that factors into their thinking, most people, I think, would hem and haw about what that even means. So just being able to identify the common biases that all of us are subject to, like confirmation bias, overconfidence, and hindsight bias, just being able to know the types of mental actions we typically undertake when we're making decisions, that's a good start.

What would make a political election forecast more accurate?

One of the things I found so striking, in both Kahneman's work and in Philip Tetlock's book Superforecasting, is what they say about political beliefs or political convictions. So here's a question for the audience to consider: who do you think would make a more accurate forecast about a political election? Someone who's a very staunch advocate for one party or the other, extremely committed to politics and particularly to one party, or someone who's relatively neutral?
That's a good question to ask, and I know you know the answer, Regina, so tell us about that and how it actually works out.

Yeah. I think the 2016 election was a fantastic use case for examining that, and my research partner and I actually did exactly that: we published a story in the Washington Post about who got it least wrong. Because everybody got it wrong, right?

All the pundits were wrong. A hundred percent of them were completely incorrect.

Not a hundred percent, but it was pretty close.

I'll tell you, I got it wrong.

Me too. Most people did. And if you want a funny story: I'm a native New Yorker, and in the work that I do, I do a lot of work with governments in Europe. I was in the office of a senior political adviser, and we were talking about the upcoming 2016 election. And he said: well, you're from New York, you know Hillary's going to win, right? And I said: well, I just cannot possibly imagine otherwise. And this was my New York native bias coming out. I thought there was simply no way that Donald Trump could become the 45th president of this country, because certainly people would come to see him in the same light that we in New York see him; the majority of New Yorkers see him in a very particular light. So I was saying this to the adviser, and both of us were shaking our heads, thinking: yeah, it's very unlikely that he's going to win. About three weeks after I had that conversation came the FBI announcement about going back in to investigate Hillary's records. I changed my forecast on that, but not closely enough: not close enough to the temporal scope, not close enough to the end, to get my score to a really good level. That was a clear example of my bias, presuming that everybody would see things the way I, a native New Yorker with long experience of Donald Trump, would see them. That was clearly wrong, and it was a good lesson for me. And so when I saw this adviser a few months later and he said, boy, we both flubbed that one, I said: yeah, we certainly flubbed that one. Even my adjustment towards the end was not significant enough to really make it a good forecast.

You're bringing up a very good point too, which is that where you are physically on the planet is actually going to affect your perspective. So you're in New York; I'm here in the super liberal bubble of Los Angeles, and so we live with a distorted view (it's hard to understand that, but we do) because everybody around us thinks the same way. And by the way, the same thing's true in the red states. In the red states it's unthinkable that Donald Trump could have lost the most recent election, because everybody around them was pretty much a fan, and so they saw widespread signs of support. So one of the things you have to get in the habit of is checking your geographic place: how is that blinding you? Who are you with? Who are the people you surround yourself with? One of the ways to counter cognitive bias is to align yourself with people who can challenge your thought process, and I know you work with a business partner, Pavel, for that very reason. It's like two brains are better than one. And even in the super forecasting technique, it's a group; it's always a group. So talk a little bit about how other people can help us see our own blind spots.
Yeah. One of the things that I think works really well, especially since most decision making takes place in groups, in teams, is learning a good process to arrive at some kind of predictive insight when you have to do it with a bunch of people. What we know is that, yes, diversity in thought makes the collective intelligence more powerful in many cases. But there is also a step that you need to take before you get to that collective discussion, which is independent estimation. Even if you're operating in a team, it's better that every single individual within that group makes their own independent estimate about what they think is going to happen before they start talking with other people, so that they are not allowing in potential groupthink bias or anchoring bias or other types of biases. I mean, this happens a lot: I often find myself the only woman in a room, or one of very few women in a room. There are biases associated with that; the minute I walk in, before I even open my mouth, there are going to be perceptions about that. So how do you circumvent those kinds of biases? How do you circumvent those kinds of problems that arise in the accuracy space? The first step is: make independent estimations.
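A minimal sketch of that estimate-first, discuss-second step: every member commits to a probability before any discussion, and only then are the numbers combined. The median used here is one common, simple aggregation choice, not necessarily the one used in Regina's research, and the team and numbers are invented.

```python
# Sketch: independent estimation before group discussion.
# Each member writes down a probability without seeing the others'.
import statistics

independent_estimates = {
    "ana": 0.62,
    "ben": 0.55,
    "chloe": 0.71,
    "dev": 0.48,
}  # hypothetical team, hypothetical numbers

# Aggregate only after everyone has committed, so nobody anchors on
# the first voice in the room.
group_forecast = statistics.median(independent_estimates.values())
print(f"Group forecast (median of independent estimates): {group_forecast:.2f}")

# Discussion can then adjust this number, but each revision is tracked
# against the committed starting points rather than replacing them.
```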
This also comes back to team selection, Regina. You've talked about this in terms of the forecasting teams for IARPA and so forth. But how do you go through that process of team selection? Do you purposely pick people without domain expertise, for example, as a reference?

At the experimental level, we do random assignments. To maintain the ability to really detect whether or not our systems are working, we usually do not pre-select people to be in certain groups. However, in the case of some of the research that we're doing, it's not about who we want in specific cohorts; what we're looking at is how the system applies to certain types of samples. So in our case, we have one group made up of super forecasters, people who have been shown to be very consistently accurate over an extended period of time. That's one discrete group. We have another discrete group of life sciences professionals, people who are experts in this field. And then we have people who are total laypeople: zero forecasting experience and zero biomedical experience.

But those folks actually have some elements in common, some traits in common. They read widely, they read eclectically, they don't hold very fixed political beliefs, as you kind of alluded to in the previous comment, and they're flexible thinkers. And I think it's also important for people to understand: they're not fixated on their prediction. They're perfectly willing to change their prediction. As you explained with the 2016 election, super forecasters are quite comfortable saying: oh, new information has come in, and therefore we're going to modulate a little bit; we're going to moderate and change our prediction, our forecast, a little bit, based on that new information. And some people are too fixated, too rigid: I made my forecast and I'm going to stick to it. That's almost a recipe for going wrong. So I think even if the folks are not experts in a particular subject matter, they do share some common traits, even if they come from divergent backgrounds.

Yeah, and I think the key thing is that it's a trainable skill. So yes, super forecasters have a propensity to do this naturally, but for people who don't, you can teach them.

Well, in fact, that's what you're doing now. Can you talk a little bit about your program? Because this is a good opportunity for those who are listening to learn that this is actually something you can improve. Tell us about it.

Thanks, yeah. So basically, since 2012 I've been developing training programs on how to take people who don't have this natural super forecaster propensity, but who need to be able to make good decisions and good forecasts, and teach them a step-by-step process for being a better forecaster: how to think like someone who makes forecasts professionally, and how to do that well. And much of it is really about practice. You need to get them into an environment where they can just make a forecast, because most people have never done that before.

They haven't done it in a disciplined way. I mean, we make forecasts every day: we decide what clothes to wear, whether or not to bring an umbrella. People do that on an intuitive level, but they don't think about their process. And one of the keys to becoming a better forecaster is to start to expose your thought process and become familiar with it. This is the idea of thinking about thinking, and all the authors we've mentioned so far write deeply about this, because it takes a great deal of skill. It's also very hard for people to figure out how to think about their thought process. Is that one of the things that you train people on, Regina?

Yes. Metacognition, or thinking about thinking, is essential.

Metacognition. I love that.

Yeah. And so what we are really trying to get people to do is to take a step back, go into that System 2 mode of thought that Daniel Kahneman talks about, and be careful about where they might fall down in their process, and where they can boost that process. One of the things that is so important for us, both as researchers and as people whose job is making people better forecasters and providing highly accurate forecasts, is getting people used to the simple act of training in an environment where... it goes back a little bit to what you were saying, Robert. When I describe this for people who've never done forecasting before, the first part is to get them to frame it differently. Every single decision that we make, every single one of them, is a bet on a future outcome. It's a bet on something that is yet to happen. So everything that we do, decision-wise, could ultimately be regarded as a forecast. We just don't think about it that way.

That's a great way to put it. The science fiction author David Brin, who someday we'll get on the show, is very bold about saying "I'll place a bet on that." He's perfectly willing to put his money where his mouth is about his forecasts, and he's always challenging people on social media to do the same. It's a good idea. And one of the groups of people that Tetlock identified as consistently very good at this are people who invest in stocks, the people who have been successful with it. Again, they tune their bets. They're constantly adjusting their position based on new information. They don't just buy and hold all the time.
So there's a certain set of skills that can be taught, and that's an interesting thing. So for those who are interested: what URL should they go to, Regina, to find out about learning how to be a better forecaster?

The easiest way to do it is to go to www.pytho.io, and if you look for A-R-E-T-E, that's where you can sign up, and we can send you more information about our training programs and about our research. There are a variety of things that we can offer to people, whether they want to look at it more from the scientific side or they want to just learn and develop the skill. If they go to www.pytho.io they can find a lot of information about that. And you can also reach me on Twitter or LinkedIn, and I think we could probably put the addresses on...

Yeah, people will get all that social media bump.

What is the anticipatory intelligence movement?

Cool. So, Regina, there were a couple of things you talked about before we started recording that were super interesting. In our remaining time, tell us about the anticipatory intelligence movement, and the workshop, and what you're doing with the National Science Foundation.

So I think that there's definitely a... well, there's certainly a group of people who have been working on these problems in a variety of different places, coming at it from different perspectives but still maintaining the same focus on a lot of what we've discussed in this last hour: the work of people like Phil Tetlock and Barb Mellers and Daniel Kahneman, how we interpret what they learned in their research, and how we're adapting it as time moves forward. Again, what we're looking at in our work has a lot to do with the hybridization of human intelligence with machine intelligence, and this has a lot of potential ramifications for our safety, our security, and our knowledge. I think we've seen certain examples where it doesn't go right. The key, and this is the really hard part, is that so much of this is about giving people the sense of learning a process, learning a high-quality system, and that has a lot to do with developing a taste for nuance. That is a tough thing for people to develop, and there are a lot of factors that make it harder.

A taste for nuance, you're saying. Give us an example.

So I think that when the pandemic hit last year, there was so much confusion, so much chaos, that people were focusing a lot of attention on certain types of drugs because they were in the press a lot. You would see those brand names, you would see those company names. Before that, most people would never talk about something like recombinant DNA or CRISPR gene editing or genomic sequencing. People talk about it now. But back then there was such a lack of nuance around what was happening, mostly because of fear and panic, disinformation campaigns, people recommending drinking bleach, and so forth. There was just a lot of bad information, and it was hard for people to sort through it.

And the scientific process takes a long time. The FDA moved notoriously slowly to come out with any kind of pronouncement, so people weren't sure where to get their guidance. That's a big issue: the media environment around us influences us. If you see 10 headlines saying that Mark Zuckerberg thinks that Facebook will be the metaverse, it wouldn't be surprising for most people to arrive at the conclusion that that's probably going to happen, even if it's just a press release, even if it's just a concerted press push. It's widely understood in politics, but it's also true for companies. Companies are trying to craft a perception about where they're going to be in the future. They're trying to influence shareholders and the stock market, and so they use a sort of media disinformation campaign to influence people's thinking. A big part of the work that you do is to illuminate that and show people: hey, your media diet is going to influence the thoughts that are inside your head, and eventually those ideas are going to take root. You might start to believe them, whether or not they're based on any kind of fact.

You know, you mentioned the pandemic, and I have to ask you this question, because it's been on my mind. Nothing was easier to forecast, literally nothing was easier to forecast, than COVID-19. For 20 years we'd been hearing from everybody in the field of epidemiology that there was going to be another global breakout of some kind of highly communicable disease.

And we had very well developed plans on how to tackle it.

That's exactly right. There were teams ready, although they had been kind of deactivated in the US in some cases. But, you know, Laurie Nadal wrote a book called The Next Plague in like 2003 or 2004, and this idea has been around for a long time. There were books that came out just a year before, and one scientist had even said: look, it'll be a coronavirus, and it'll come from a bat in the Wuhan area. This doesn't require a rocket scientist. You could just read what was already published and understand this was coming. So here we had a case where there were excellent forecasts available, but the people in charge ignored the forecasts. What's that called? I think of it as the Cassandra syndrome: you've got somebody outside the temple telling you exactly what's going to happen, don't go there, it's going to be bad, and then Agamemnon is like, nope, send the ships off to Troy, we're going. So what do you think of that?

I love that we've got so many Greco-Roman, classical references flowing through this conversation. I think that is part of the problem, one small part of the problem. But as to the reasons why leaders and decision makers don't follow through on the copious amounts of evidence or data they've got sitting in front of them to make the right decision: there are so many variables that affect that actual decision-making process. Yes, the Cassandra complex is part of it, but there are other factors: political liabilities, personal incentives, whether it's greed or power. So it's not just about the cognitive factors at stake, like overconfidence. There are so many layers to a decision, especially at a high level. What we try to do is to decompose that: if we're looking back at forecasts that went wrong, what are all the possible pathways where the decision making took the wrong side of the fork? That's a very complex process, and part of what we're talking about now, part of what we're thinking about, is how we make getting through that process a little easier.

But I often find that in decision making at a high level, it's really down to... usually it's the person who has the most money, the most power, the most seniority. They do what they want. They're often not as easily influenced as you think they might be.

Fantastic. Well, maybe let me just finish with one question so we can wrap this up, Regina. If you're talking to the average person out there today, could you give them a list of actions to take, or ways to change their lifestyle, so that they're better placed for the future?

Yeah. I think the first thing is to be informed. Being well informed really is the cornerstone of being a good forecaster, or of just making good decisions: to be well informed and to keep updating yourself on that information. It's not enough to know a fact one day and then just let it sit until you have to make a decision years later.

And eclectically, across many different disciplines. People get stuck in a bubble, right? They get a habit and then it gets reinforced. Okay, go ahead, I'm sorry, I'm interrupting you.

Oh no. Yes: be diverse in your thought. So: be well informed, be diverse in your thought, and have a process. And that can start very easily with something as simple as learning how to make a decision table, where you are basically evaluating the trade-offs: if I have to make a decision, where are the trade-offs? Then you just have to score which trade-offs are the worst ones, which are the least acceptable adverse trade-offs that I have to make. Just being able to do that is a good start.
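A decision table of the kind she describes can be as small as a few options scored against their adverse trade-offs. The commuting example and scores below are invented purely for illustration.

```python
# Hedged sketch of a simple decision table: invented options and scores.
# Higher score = worse (less acceptable) adverse trade-off.

decision_table = {
    "commute by car":   {"cost": 3, "time lost": 2, "weather risk": 0},
    "commute by bike":  {"cost": 0, "time lost": 1, "weather risk": 3},
    "commute by train": {"cost": 2, "time lost": 3, "weather risk": 1},
}

# Total the adverse trade-offs and pick the option with the least downside.
totals = {option: sum(tradeoffs.values()) for option, tradeoffs in decision_table.items()}
best = min(totals, key=totals.get)

for option, total in sorted(totals.items(), key=lambda kv: kv[1]):
    print(f"{option}: total adverse score {total}")
print(f"Least-bad option: {best}")
```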
littleeasier but i often find that in decision making at a high level it's really down tousually it's the person who has the most money most power most seniority they dowhat they want they're often not as easily influenced as you think they might befantastic well that's uh i maybe let me just um finish with one question um so we canwrap this up regina um you know if if you're talking to the average person out there todaycould you give them a list of actions to take or ways to change their theirlifestyle so that they're better placed for the future yeah uh uh i think that the the firstthing is to be informed uh uh being well informed really is the cornerstone of of of being a goodforecaster or just making good decisions uh is is to be well informed to andupdate yourself on that information it's not enough to know a fact one day and then just sortof let it alone until you have to make a decision years later eclectically across the many differentdisciplines like people get stuck in a bubble right they get a habit and then they get reinforced uh okay go ahead i'm sorry i'm interruptingyou oh no it's it'd be diverse your thought uh uh it's it's be well informedbe diverse in your thought have a process right and and that can start very easily with something assimple as you know learning how to make a decision table uh you know where you are basically evaluating what are thewhat's the trade-off i'm making if i have to make a decision where are the trade-offs then you're you just have to score whichtrade-offs are the worst ones you know which are the least uh uh the least acceptable adversetrade-offs that i have to make uh just being able to do that is a goodstart um you know and and so what we do is to try to provide processes by whichpeople can at least learn how to do that stuff quickly and easily and the more you practice it the more you putyourself in an environment where you are testing yourself and tracking yourself because againmuch of what we talk about is about uh you know we we go back to you know there are lots of people out there inthe world who say that they're futurists but if they aren't tracking themselvesin an environment where they can definitively say yes for the last 10 years i have had a consistent trackrecord and being able to accurately predict here's my briar score you know this is my performance in this year andthis year and this year until you do that you know i think that we fall into the trap of calling peoplefuturists um you know people who are probably not that accurate not not that good atit so so when we talk about the track record issue yeah i think that we need to be puttingmore futurists in the environment where okay can you put your money where your mouth is you're absolutely right reginawe're going to need a lot more futurists in the future at least future-minded people who we're trying to reach with this program because the world ischanging fast and it's really important for people to develop their own methodology for navigating through thatfast changing world now folks you're listening to the futurists and our guest today has been regina joseph she is asuper forecaster she was actually one of the top performing super forecasters in that iarpa program we talked about thebeginning of the show she's also a cognitive science researcher and a geopolitical analyst and you can learnmore about her at her website p-y-t-h-o dot io and you can also take acourse there and learn how to become a super forecaster yourself now you've been watching or listening to 
You're absolutely right, Regina. We're going to need a lot more futurists in the future, or at least future-minded people, who are exactly who we're trying to reach with this program, because the world is changing fast, and it's really important for people to develop their own methodology for navigating through that fast-changing world. Now folks, you're listening to The Futurists, and our guest today has been Regina Joseph. She is a super forecaster, actually one of the top-performing super forecasters in that IARPA program we talked about at the beginning of the show. She's also a cognitive science researcher and a geopolitical analyst, and you can learn more about her at her website, p-y-t-h-o dot io, where you can also take a course and learn how to become a super forecaster yourself. Now, you've been watching or listening to The Futurists with Brett King and me, Rob Tercek, and we will see you in the future.

[Music]

Well, that's it for The Futurists this week. If you liked the show, and we sure hope you did, please subscribe and share it with people in your community, and don't forget to leave us a five-star review; that really helps other people find the show. You can ping us anytime on Instagram and Twitter at @futuristpodcast with the folks you'd like to see on the show or the questions you'd like us to ask. Thanks for joining, and as always, we'll see you in the future.

[Music]