
How Instagram’s AI Destroyed My Art Career Before It Even Started

My name is Emily Leeds, and I’m a 21-year-old senior at Pratt Institute studying 3D Animation. On November 9, 2025, both my Instagram and Facebook accounts were permanently disabled. Meta claims I violated their policies on “child sexual exploitation, abuse and nudity.”

The problem? My accounts contained nothing but my art portfolio—digital illustrations, 3D animations, character designs, and concept art that I’ve been building since I was 13 years old. Professional work. Student work. The kind of art you’d see from studios like Pixar or DreamWorks.

In one automated decision, nearly a decade of professional networking, portfolio building, and career development vanished. No warning. No explanation. No real human review. Just an algorithm that decided my art was somehow illegal content.

This is my story, but it’s also a cautionary tale about what happens when automated systems replace human judgment, and when tech companies provide no meaningful way to correct their mistakes.

What Happened

On November 9th, I tried to access my Instagram account (@emily.hithere) and found it had been disabled. The notification said my account violated Community Standards regarding “child sexual exploitation, abuse and nudity.” My immediate reaction was confusion, then panic, then disbelief.

I’ve been using Instagram as my primary portfolio platform for years. My account had somewhere between 2,000 and 3,000 followers—art directors, fellow artists, potential employers, former clients, friends from the animation community, and professional connections I’ve made throughout my time at Pratt. It wasn’t just social media. It was my professional network, my portfolio, my resume, and my primary tool for finding work in an industry where “share your Instagram” is the standard first step in any job application.

My Facebook account was disabled simultaneously, cutting me off from personal connections, family, friends, and additional professional networks.

What My “Offensive” Content Actually Is

You can see my portfolio here. Judge for yourself whether this looks like content that should be permanently banned under policies meant to protect children from exploitation.

My work includes:

2D Digital Illustrations: Stylized character art, often with anime and fantasy influences. Think colorful, expressive characters in imaginative settings.

Character Design Sheets: Professional concept art showing character turnarounds, expressions, and costume variations. This is standard industry work that every animation student creates.

3D Animation Projects: Academic work and personal projects involving 3D character modeling, rigging, and animation. The kind of technical work that requires years of training and specialized software.

Architectural Renderings: Environment designs and structural visualizations created for coursework.

Concept Art: Fantasy characters, creature designs, and imaginative scenes typical of the animation and gaming industries.

Nothing sexual. Nothing exploitative. Nothing inappropriate. Just art.

The “Appeal” Process That Wasn’t

Instagram’s interface offered an appeal option, so I immediately submitted one. I explained that I’m an art student, that my content is entirely professional portfolio work, and that this was clearly an error.

A few hours later, I received Instagram’s response: “We reviewed your account and found that it still doesn’t follow our Community Standards on child sexual exploitation, abuse and nudity.”

The response also stated: “You cannot request another review of this decision.”

That was it. Appeal denied. Case closed. The speed of the response makes it clear that no human actually looked at my portfolio. This was automated from start to finish—an algorithm flagging my content, and another algorithm rejecting my appeal.

Why This Matters More Than You Think

I understand that Meta has to moderate billions of pieces of content. I understand that they use AI and automated systems to do this at scale. I even understand that mistakes happen.

But here’s what I can’t accept: when an obvious mistake happens—when a college art student’s professional portfolio is flagged as illegal content—there’s no way to reach a human being who can look at the actual content and correct the error.

For me, this isn’t just an inconvenience. This is career-threatening.

In the Animation Industry, Instagram Is Your Portfolio

When I apply for internships, jobs, or graduate programs in animation and digital art, the first thing recruiters and art directors ask for is my Instagram handle. Not a PDF portfolio. Not a website link first. Instagram.

Why? Because Instagram shows:

– Your most recent work (are you actively creating?)
– Your range and versatility (can you work in different styles?)
– Your engagement with the art community (do other artists follow and interact with you?)
– Your consistency and productivity (how often do you post?)
– Your professional network (who follows you and who do you follow?)

Losing my Instagram means losing my primary tool for:

– Job applications (I’m applying for positions now as I approach graduation)
– Internship opportunities (summer positions are being posted now)
– Graduate school applications (portfolio reviews are ongoing)
– Freelance work (I’ve gotten several paying clients through Instagram)
– Professional networking (connections with industry professionals)
– Community engagement (feedback, collaboration, learning from other artists)

I’m not exaggerating when I say this could derail my career before it starts. And I have no recourse.

The Broader Problem

I’m sharing my story not just because I want my account back (though I desperately do), but because this represents a much bigger problem with how content moderation works on platforms that millions of people depend on for their livelihoods.

The problems with this system:

1. Automated decisions with no human oversight: AI makes mistakes. That’s understood. But when the AI is wrong, there should be a way to get a human to review the actual content.

2. No meaningful appeal process: The “appeal” is just another algorithm. There’s no option to reach a real person who can look at my portfolio and see that this is obviously an error.

3. Disproportionate impact on artists: Content moderation algorithms frequently flag legitimate art—especially figure drawings, character designs, and fantasy art—because they can’t distinguish between artistic content and actual violations.

4. No consideration of context: A professional art student’s portfolio should be evaluated differently than random uploaded content. Context matters, but algorithms don’t understand context.

5. Permanent consequences with no due process: My accounts are permanently deleted. All my content will be erased. Years of work and connections gone, with no opportunity for genuine review.

What I’ve Tried

Since the automated appeal failed, I’ve:

– Submitted appeals through every available Meta form I could find
– Researched the Oversight Board (but it requires an active account to submit, which I no longer have)
– Sent a formal legal notice to Meta invoking arbitration provisions in their Terms of Service
– Reached out to press and media to highlight this issue
– Created this blog post to document what happened

I’m not trying to fight Meta as a company. I’m just trying to get someone—anyone—to actually look at my content and recognize the obvious error.

How You Can Help

If this story resonates with you, or if you believe that tech platforms should have better systems for correcting algorithmic errors, here’s how you can help:

1. Share this story

– Share on Twitter, LinkedIn, Reddit, or wherever you have a voice
– Use the hashtag #AIGotItWrong to connect with others who’ve experienced similar issues
– Tag @instagram and @Meta to bring attention to the problem

2. Support better content moderation policies

– Artists, especially those working with the human form, are frequently caught in overzealous content moderation
– Advocate for human review options when automated systems make obvious errors
– Support right-to-appeal legislation that would require platforms to provide meaningful review processes

3. If you work in tech or AI

– Consider how your content moderation systems impact legitimate users
– Build in human review checkpoints for edge cases
– Design appeals processes that actually involve human judgment

4. If you’re a journalist

– This story represents a broader trend of algorithmic content moderation harming legitimate users
– I’m happy to speak with reporters who want to cover this issue
– Contact me at emyizamazin@gmail.com

5. If you work at Meta

– Please escalate this to someone who can conduct an actual human review of my portfolio
– Help me get my professional network and career tool restored
– Use this as a case study for improving your appeals process

Why I’m Optimistic Despite Everything

I believe this is solvable. I believe that if a human being at Meta actually looks at my portfolio—really looks at it—they’ll immediately see this was an error and restore my accounts.

I believe Meta is capable of building better systems that balance the need for content moderation with fairness for users.

I believe that enough attention on this issue can create pressure for better policies that protect both children (which everyone wants) and legitimate artists (who are currently being caught in the crossfire).

But I also believe that won’t happen unless people speak up. So I’m speaking up.

A Message to Other Artists

If you’re reading this because you’ve experienced something similar—legitimate art being flagged, accounts being disabled with no real recourse, automated appeals that go nowhere—please know you’re not alone.

Document everything. Save all your correspondence with the platform. Take screenshots of your content before it’s deleted. Seek legal advice if appropriate. Tell your story publicly if you’re comfortable doing so.

The more we share these stories, the more pressure there is for platforms to build better systems that don’t treat artists as collateral damage in the fight against actual harmful content.

My Work Continues

Despite losing my Instagram, I’m still creating. I’m still pursuing my career in animation. I’m still a student at Pratt, working on my thesis project and preparing for graduation.

You can see my portfolio at emily.art/gallery featuring 2D Blender animation, 3D digital artwork, and digital still art. You can contact me at emyizamazin@gmail.com. I’m still here, still making art, still building toward a career in this industry I love.

Instagram was a tool, and it was an important tool, but it’s not the only tool. I’ll rebuild. I’ll find other ways to share my work and connect with the community.

But I shouldn’t have to. And the next artist caught by this system shouldn’t have to either.

That’s why I’m sharing this story. Not just for me, but for every artist who’s been wrongly flagged, every creator who’s lost their platform to an algorithmic error, and everyone who believes that automated systems should be accountable when they make mistakes.

Thank you for reading. Thank you for caring. And thank you for sharing this story if it resonated with you.

— Emily Leeds
3D Animation Major, Pratt Institute
Brooklyn, NY
November 10, 2025

Connect with me:

– Portfolio: emily.art/gallery
– Email: emyizamazin@gmail.com
– Press inquiries welcome

Update log: This page will be updated with any developments in this situation, including Meta’s response (if any) and media coverage.

Last updated: November 10, 2025
