College students are lifelong learners. So are AI-powered fraudsters.

With each passing day AI grows more powerful and more accessible. This gives fraudsters the upper hand, at least for now, as they roll out legions of AI-powered fake humans that even governmental countermeasures—such as the Biden administration’s recent executive order—will be lucky to slow down.

Among other nefarious activities, bad actors are leveraging AI to peddle synthetic bank and online sports betting accounts, swing elections, and spread disinformation. They’re also fooling banks with another clever gimmick: posing as college freshmen.

College students, particularly underclassmen, have long been a target demographic for banks. Fraudsters are well aware that banks' yearning for customer acquisition, coupled with their often inadequate fraud prevention tools, presents an easy cash-grab opportunity (and, perhaps, a chance to revisit their collegiate years).

Early bank gets the bullion

The appeal of a new college student from a customer acquisition perspective can't be overstated.

A young, impressionable kid is striking out on their own for the first time. They need a credit card to pay for both necessary and unnecessary things (mostly the latter). They need a bank. And their relationship with that bank? There’s a good chance it will outlast most of their romantic relationships.

This could be their bank through college, through their working years, the bank that writes the loan for their first house, the bank they encourage their kids and grandkids to bank with. In a college freshman, banks don't just land one client but potentially an entire generation of clients. Lifetime value up the wazoo.

Go to any college move-in day and you'll spot bank employees at tables, using giveaway gimmicks to entice students to open new accounts. According to the Consumer Financial Protection Bureau, 40% of students attend a college that's contractually linked to a specific bank. However, as banks shovel out millions to market their products at universities, a fleet of synthetic college freshmen lies in wait, with the potential to collectively steal millions of their own.

Playing the part

Today’s fraudsters are master identity-stealers who can dress up synthetic identities to match any persona.

In the case of a fake college freshman, building the profile starts off in familiar fashion: snagging a dormant social security number (SSN) that’s never been used or hasn’t been used in a while. Like many forms of Personally Identifiable Information (PII), stolen SSNs from infants or deceased individuals are readily available on the dark web.

From here, fraudsters can string together a combination of stolen and made-up PII to create a synthetic college freshman identity that qualifies for a student credit card. No branch visit necessary, and IDs can be deepfaked. The synthetic identity makes small purchases and pays them off on time—food, textbooks, phone bill—building trust with the bank and improving their already respectable credit score of around 700. They might sign up for an alumni organization and/or apply for a Pell Grant to further solidify their collegiate status.

Pell Grants, of course, require admission to a college—a process that, similar to acquiring a credit card from a bank, is easy pickings for synthetic fraudsters.

The ghost student epidemic

Any bank that doesn’t take the synthetic college freshman use case seriously should study the so-called “ghost student” phenomenon: fake college enrollees that rob universities of millions. 

In California, these synthetic students, who employ the same playbook as bank-swindling synthetics, account for 20% of community college applications alone (more than 460,000). Thanks to the post-pandemic rise of online enrollment and learning, relaxed verification protocols for household income, and the proliferation of AI-powered fake identities, ghost students can easily grab federal aid without ever attending class.

Like ghost students, synthetic college freshmen can apply for a credit card without ever setting foot inside a bank branch. Online identity verification is a breeze for the seasoned bad actor. Given the democratization of powerful generative AI tools, ID cards and even live video interviews over Zoom or another video client can be deepfaked.

A (SuperSynthetic) tale old as time

Both the fake freshmen and ghost student problems are symptomatic of a larger issue: SuperSynthetic™ identities.

SuperSynthetic identities are the most sophisticated synthetic identities yet. Forget the brute force attacks of yore; SuperSynthetics are incredibly lifelike and patient. These identities play nice for months or even years, building trust by paying off credit card transactions on time and otherwise behaving like a real human would. But once the bank offers a loan and a big payday is in sight, that SuperSynthetic is out the door.

An unorthodox threat like a SuperSynthetic identity can't be thwarted by traditional fraud prevention tools. Solutions that rely on individualistic, static data won't cut it. Instead, banks (and universities, in the case of ghost students) need a solution powered by scalable, dynamic, real-time data, one that verifies identities as a group or signature. That is the only way to pick up on the digital footprints SuperSynthetics leave behind.

As human as SuperSynthetic identities appear, they aren't infallible. With a bird's-eye view of identities, patterns of activity quickly emerge, such as a cohort of SuperSynthetics commenting on the same website at the exact same time every week over an extended period.
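To make that idea concrete, here is a minimal sketch of one way such a periodic-activity signal could be surfaced. It is illustrative only, not any vendor's actual method; the event fields (account_id, site, timestamp) and the thresholds are hypothetical assumptions.

    # Illustrative sketch only: flag accounts whose activity on a given site
    # repeats on a near-exact weekly cadence. Real students drift and skip
    # weeks; scripted SuperSynthetics tend not to. Field names are hypothetical.
    from collections import defaultdict
    from statistics import pstdev

    WEEK_SECONDS = 7 * 24 * 3600

    def flag_periodic_accounts(events, min_events=6, jitter_seconds=120):
        """events: iterable of (account_id, site, ts) where ts is a datetime.
        Returns the set of account IDs whose gaps between events on the same
        site are all within `jitter_seconds` of exactly one week."""
        by_key = defaultdict(list)
        for account_id, site, ts in events:
            by_key[(account_id, site)].append(ts)

        flagged = set()
        for (account_id, site), stamps in by_key.items():
            if len(stamps) < min_events:
                continue
            stamps.sort()
            gaps = [(later - earlier).total_seconds()
                    for earlier, later in zip(stamps, stamps[1:])]
            # Near-constant weekly gaps with almost no jitter look scripted.
            weekly = all(abs(g - WEEK_SECONDS) <= jitter_seconds for g in gaps)
            if weekly and pstdev(gaps) <= jitter_seconds:
                flagged.add(account_id)
        return flagged

In practice, a group- or signature-based system would go further and compare behavior across many accounts at once, for instance clustering accounts whose timestamps line up with one another, but the gist is the same: the pattern, not any single data point, gives the synthetic away.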

Fake college students are just one of many SuperSynthetic personas capable of tormenting banks. But fighting back isn't the uphill battle it appears to be. If banks change their fraud prevention philosophy and adopt a dynamic, bird's-eye approach, they can school SuperSynthetics at their own game.