Cecilia Conde Finds Inspiration Everywhere

by Jean Pflum, Cecilia Conde

Jean Pflum, Communications at Antfood, sat down with Cecilia Conde, Strategy & Innovation Specialist, to talk all things innovation, AI, and inspiration.

My name is Cecilia. I'm the Strategy and Innovation Specialist here at Antfood. I'm originally from Rio in Brazil, and now I'm based in New York, so I work out of the Antfood New York office.

My background is in symbolic systems, a mix of cognitive science, design, and AI. I've always gravitated toward work that blends creative expression with emerging technologies, and at Antfood I get to live right in that intersection every day.

Cecilia Conde
Strategy & Innovation Specialist

It seems like you have a pretty vast background in not only tech, but also how systems affect behavior. What inspires you in that realm? What inspired you to get into it?

What inspires me in this realm is the sense of possibility. We've seen so many times throughout history how technology has opened up new forms of creativity that didn't exist before. We often focus on the negative changes technology brings, on what it takes away from people, but there's a lot to notice in what it brings and all the doors it opens for creativity and creative expression.

There's the aspect of accessibility in that a lot of these tools and technologies have allowed people that don't necessarily have all of the technological know-how to be able to express their artistic selves or artistic ideas. And I feel like that's one of the things that really excites me. If we even take a step back from AI, just like technological tools, they democratize creation in ways that I think are interesting. Like I wonder if we decentralize where all of the great musicians are coming from and if they're not all coming from the same schools, I wonder what it is that people actually wanna produce and listen to. I just think it's curious about how it can decentralize and kind of democratize all aspects of creation.

It seems like you're less excited about what's being made and more excited about the fact that there's a whole new set of tools for people to make in general.

Exactly. For me, the focus and the core is still craft. Human creativity is still at the core. What intrigues me is not necessarily what we can do without human creativity, but rather how these tools and everything we have available now can enhance it and open new doors in human creativity.

I don't get too excited about ideas that only seek to monetize existing products by taking the human out of the loop. That's the source of a lot of fear for people. But where a tool can dramatically scale up the output of one studio or one person, that's where it excites me. You're maximizing the time you have, and opening up net-new avenues for creation.

What does innovation look like in your current role?

Innovation at Antfood isn’t about chasing hype. We’re not trying to do what’s trending this week, because there’s rarely a creative reason to do that. Instead, we use new tools in service of storytelling and craft.

Especially with AI, the team has been experimenting continuously: composers, producers, strategy, and tech working closely together, but always with taste, intention, and ethical consideration. The goal isn’t to replace creative decision-making or generate final outputs automatically, but to support the creative process in ways that feel additive and human.

One way the team approaches this is by treating generative tools as thought starters rather than answers. They help us explore ideas faster, surface new directions, or challenge assumptions early on, while the core creative decisions still come from the composers and creative leads.

We’re also actively exploring how to develop tools trained on our own internal IP: shaped by the musical language, taste, and values of Antfood. That work is very much a collective effort: it reflects how our composers write, how our team gives feedback, and how we think about sound emotionally and culturally. The goal isn’t to replicate everything that’s ever existed, but to build systems that think with the team and extend the way we already create together.

I really like what you said about using these tools to be conversation starters. Are there any conversations that have started through these tools that when you had these conversations you were like, “yes, this is what we're trying to get at?”

I think of these tools as conversation starters in a very literal sense, not because they give us answers, but because they surface questions we wouldn’t have asked otherwise.

One kind of conversation they start is internal and ethical. The moment you introduce generative tools into a creative workflow, you’re forced to ask: What are we actually comfortable with? What are we protecting — our IP, our values, our taste? And where are we willing to be open? Those questions now show up much earlier in the creative process than they used to, and that shift alone is meaningful.

But the more exciting conversations, for me, happen inside the creative room. We use generative tools not to replace composition or decision-making, but to quickly explore a wider space of possibilities. You can put something rough on the table almost instantly: a texture, a mood, a direction, and suddenly everyone has something concrete to react to. That accelerates taste-based discussions: Why does this feel right? Why does this miss? What would make it more human, more intentional?

There have been many moments where a tool generated something imperfect, even wrong, and that was exactly the point. It clarified what we cared about. It helped us articulate values that are usually implicit.

A composer might react immediately by saying, “this is too literal,” or “this loses the emotional arc,” and suddenly we’re articulating things that are usually intuitive or unspoken.

In those moments, the output isn’t the product. The value is in how the Antfood team responds to it together, how composers, producers, and creatives use it as a foil to sharpen their instincts, align on taste, and clarify what feels true to the work. That’s when these tools are doing their best job: never leading the creative process, but helping the humans in the room see more clearly what they care about.

I'm curious if there was a time where you were all sitting around the conference table and, because of one of these tools, a conversation got started where you thought, “yeah, that's exactly what I was trying to do.”

Yes, that kind of moment happens quite often, actually.

Any time you introduce new tools that meaningfully change how work can be done, you surface a certain amount of skepticism. That's natural, and I usually try to get at that tension early. Challenging an established way of working is harder for people who've spent years refining a particular process, compared to those entering the field alongside these technologies.

What’s been encouraging is that at Antfood, leadership approaches that tension with optimism rather than defensiveness. One concrete example is a project we’re currently exploring that involves making parts of our internal IP and music catalog more client-facing, and potentially more open.

That immediately sparked a meaningful conversation around a real tradeoff: how much do we prioritize protecting our IP versus expanding its reach? Those two goals are linked. Greater reach inevitably brings greater exposure, but limited reach can also reduce the cultural and creative value of the work itself.

The tool didn’t give us an answer, but it did force us to articulate what we actually care about. It made the tension more visible. And that’s exactly the kind of conversation I’m interested in starting: not “can we do this,” but what does doing this say about our values?

I think we’re reaching a moment where the decision-making norms in the music and creative industries are shifting. Many leaders are reevaluating what protection really means in a world where digital material is inherently fluid. That uncertainty doesn’t make the conversation less important, it makes it more urgent.

What excites me is that we’re not just adopting new tools; we’re actively reshaping the ethical frameworks around them, in real time, through thoughtful, often uncomfortable discussions. That’s where meaningful innovation actually happens.

I think that makes a lot of sense. And I guess this is something I've always personally wondered because obviously there is such an ethical conversation happening around these things. Was that something that you studied in tandem with your tech study?

I was always deeply interested in ethics, and ended up doing a kind of minor in philosophy. There were the mandatory parts of studying CS that engage with ethics and tech policy (like where training data comes from, how representative it is, and how biases get reinforced as systems scale), but I was especially drawn to the philosophy of different ethical frameworks. Ethical frameworks seem abstract, but they're actually pretty visible all around us: companies and individuals tell us about their ethical frameworks in how they use AI. Some might be optimizing toward a better output at all costs, while others care deeply about protecting artists. That perspective still shapes how I think about AI today: ethical questions don't sit at the end of the process; they're embedded in how systems are designed from the start.

Do you think that also your experience with being a musician and being a singer has informed how you approach new tools like this?

Absolutely, my experience as a musician has deeply shaped how I think about these tools.

Growing up, I was always drawn to very ambitious music. I listened to the best singers in the world, people like Barbra Streisand and Ella Fitzgerald, Elis Regina, and I was more interested in the ideas behind the music than in whether I could technically execute everything perfectly myself. My taste often ran ahead of my skill, and that tension stayed with me.

What excites me about today’s tools is that they help close that gap, not by lowering the bar, but by making translation easier. If you have a clear creative intention, you can now experiment, iterate, and learn much faster. You don’t have to fully master every technical step before you’re able to explore an idea meaningfully.

That doesn't replace craft; I think it actually reinforces it. The better your taste, and the clearer your intention, the more powerful the tools become. They reward people who know what they're aiming for, even if they're still learning how to get there.

For me, that’s what democratization really means: not removing expertise, but expanding who gets to participate in the early, exploratory stages of creation, where curiosity, intuition, and emotional clarity matter most.

In the same way that it is honing a craft to learn to play piano or guitar, do you think there is a craft to be learned in prompting and interacting with AI tools?

It’s interesting, because a lot of people expect “prompting” to become a kind of standalone skill, almost like its own discipline. I don’t fully see it that way.

I do think it helps to understand, at a high level, how these systems work — how they interpret inputs, where their limits are, and how to evaluate their outputs. That context makes you more precise in how you ask questions and more critical in how you assess the results.

But for me, the “craft” isn’t the prompting itself, it’s the judgment around it. Knowing what to ask, when to push further, when something feels off, and when human intervention is needed. The tools move faster, but responsibility doesn’t disappear.

That perspective is very much shaped by how we work at Antfood. Our composers already operate at a very high level of craft: they’re making decisions based on emotion, narrative, and subtle shifts in tone. Because of that, AI naturally takes on a different role: it’s not the thing that defines the craft, it’s something that supports exploration around it.

For example, a tool might help surface variations or directions more quickly, but it’s the composer who decides what feels intentional, what serves the story, and what gets discarded. In that sense, the value of AI is proportional to the clarity of human taste guiding it.

So I don’t think of prompting as something to master in isolation. I think of it as part of a broader creative discipline, one where strong judgment, musical intuition, and responsibility remain central, and the tools are there to help us move through ideas more fluidly, and make ideation more scalable, not replace the craft itself.

It seems like there's a lot of excitement around all of these new tools, especially at Antfood. Does any of it make you nervous?

I'd be lying if I said none of it makes me nervous, but I try to stay thoughtful rather than nervous. What keeps me grounded is a strong belief that the human component will always remain essential, especially in creative work.

I studied human-centered AI, and a lot of that intersects with philosophy of mind. Some theories treat the human mind as essentially a system trained on accumulated data — perception in, behavior out. That’s not a view I personally subscribe to. I don’t think humans are limited to their training data in the way machines are.

Humans extrapolate. We imagine. We feel. Creativity often comes from making leaps that aren’t strictly logical: from emotion, intuition, memory, and lived experience. A model can learn correlations about the color red, but it doesn’t know why red feels warm, or why a certain harmony suddenly feels nostalgic. It doesn’t feel the temperature of the fire.

I’m reminded of that every day working with the composers at Antfood. You can make the smallest adjustment to a piece, a subtle harmonic shift, a change in pacing, and suddenly everyone in the room feels it. Clients will say things like, “this feels sadder than last time,” even if they can’t explain why. That kind of sensitivity and shared emotional recognition is deeply human.

Because of that, I try not to see AI as a threat to creativity, but as something that can expand what humans are able to explore and express, as long as it’s used with intention. And that’s something we’re very conscious of at Antfood: using these tools to support human judgment, not replace it.

I also don't think we're particularly close to AGI (artificial general intelligence) in the sense of recreating a human mind in all its dimensions. AGI is an interesting theoretical horizon, but the gap between pattern generation and human experience is still very real. That's why, for me, the focus stays on the present: building tools that work with human creativity, not instead of it.

Do you think AGI is a noble pursuit?

I think it’s a meaningful question, but I’m not sure I’d frame AGI as a noble pursuit so much as a revealing one.

I studied philosophy of AI at Stanford, and one idea that really stayed with me is that much of human progress is driven by a desire for continuity: to survive, to extend life, to preserve ourselves in some way. When you look at AGI through that lens, it starts to resemble a modern expression of that impulse: an attempt to create something that outlasts us, mirrors us, or carries human intelligence forward.

I don’t think that makes the pursuit inherently good or bad. It makes it human. We tend to push toward the limits of what’s possible simply because we can, and because curiosity and ambition are deeply embedded in how we operate as a species.

What matters more to me than the end goal is how we pursue it — whether we stay thoughtful about what we’re optimizing for, and whether we remain clear about the difference between building powerful systems and preserving the distinctly human qualities that give those systems meaning in the first place.

So while AGI is an interesting theoretical horizon, I like to keep my focus much closer to the present: building tools that extend human capability, not replace it, and making sure our values evolve alongside our technology rather than lag behind it.

Thank you so much. Anything else you wanna say?

You asked me what inspires me. I'm also inspired by how sound shifts and how people interact with it, whether that's brands or people in a moment.

When one of our composers here, Bennett, wanted a track to hit a specific area of the tonal map, he would be very intentional about it. It would be a very subtle shift. And then the clients would sit here, listen to it, and say, “oh yes, this feels sadder than last time.” That really proves to me the power of music, and what a powerful communication tool it is.