AI, My Bracket… and Why Advice Still Wins

Way back, in what now feels like the Stone Age, I had convinced myself I had found an easy way around some of my college math requirements: I took a bunch of computer programming classes. At the time, and I have no idea if that is still the case, programming was new enough that it lived in the math department. So instead of trig, advanced calculus, and the rest, I took BASIC, PL/I, COBOL, and Pascal. It seemed like a smart move. I even remember one of my first end-of-year projects: a program in BASIC that prompted the user to input data and generated a simple 1040 tax return. Yes, I think I invented TurboTax.

But then came the summer internship. And one summer was all it took. I can still picture it pretty clearly. The computer programmers were buried in the basement of an office building, with no real sunlight and long hours in front of a blinking green screen, and I slowly started to resemble exactly what you would expect: pale, frustrated, gaining weight, and wondering how I got there. It didn't take long to realize this probably wasn't my path. But I did take something from that experience that has stuck with me ever since.

Garbage time

My boss for the internship had a saying he used all the time: garbage in, garbage out. Put in bad information, and you get out bad answers. Typically, that's the kind of thing you hear and move past, but it turns out it applies to a lot more than just programming. If the inputs weren't right, the output didn't matter. In fact, it could look right while being completely wrong, which is probably worse. Fast forward to today, and that same idea keeps coming back to me as I watch everything happening around AI.

Bracketology

You can’t go a day without hearing about it. ChatGPT, Claude, and a dozen others. There’s a lot of excitement, and if we’re being honest, a little bit of concern from people who wonder where it all leads or if it will replace them. So, I decided to test it in a way that felt familiar.

Instead of filling out my NCAA bracket the way I normally would, this year I asked AI to build one for me. I gave it pretty specific prompts. I wanted something with a few well-placed early upsets (McNeese?). I wanted a winner that wasn't a number one seed (Iowa State?). I wanted a chance to win without being a clone of everyone else's bracket. It sounded reasonable, thoughtful even. It took a while, even longer than it should have, because it kept messing up the actual pairings. But I persevered, and we got it done. One day in? I'm sitting at the bottom of our office pool. Not middle of the pack, dead stinking last. Garbage in, garbage out.

That doesn’t mean AI isn’t impressive, because it is. It can do things quickly and at a scale that we haven’t seen before. It can help automate, organize, and take a lot of friction out of day-to-day work. But ultimately, it still depends on the inputs. More importantly, it depends on the understanding behind those inputs. And that’s where I think a lot of the conversation misses the mark, especially when it comes to financial advice.

Can you program subtlety?

I have said to many people that AI doesn’t understand nuance. It doesn’t understand the difference between how someone feels about risk, how they say they feel about risk, and their actual capacity for risk. It doesn’t understand what it means when clients say they want a second home but “have” to make sure their kids get through college without debt. Those tradeoffs aren’t always logical on paper, but they are very real in practice.

It also doesn’t pick up on the small things. Hesitation in a conversation or a change in tone. The story about how someone grew up around money (or didn’t) and the feelings that are ingrained. Those moments matter more than most people realize, and they rarely show up in clean inputs.

That’s why I don’t worry about AI replacing advisors. I see it as something that good advisors will use to get better. There’s real value in using it to automate parts of the business, to take better notes, to create more consistency, and to free up time. But all of that should lead to more time spent where it actually matters: across the table, in real conversations, understanding what really drives decisions.

Because at the end of the day, this has never just been about getting to the right answer. It’s about helping someone feel confident in the path they’re on, even when that path isn’t perfectly optimized. And that’s not something you can fully outsource.

I like AI. Even if it currently has me sitting at the very bottom of the bracket standings. It’s useful, it’s powerful, and it’s going to be part of how we all work going forward. But that old lesson still holds up.

Garbage in, garbage out. When it comes to financial advice, the most important inputs are still human.
