A Token of Appreciation
MichaelCSMDash, Dropbox Staff
Hi there!
Michael here from the Dash Customer Success team, sipping my coffee on a breezy morning in Austin, Texas. March is right around the corner, which means rodeo season is about to kick into full swing, so yes, I’m dusting off the old cowboy boots.
But boots aren’t the only thing I’ve been dusting off lately. I’ve also been brushing up on something a little more technical: AI tokens. And before you think, “Hold on… based on the title… this isn’t the blog post I expected,” just stay with me. (Clever, right? AI could never 🤣)
The truth is, over the past few weeks we’ve heard more and more customers asking about AI tokens: what they are, how many they’re using, and, most importantly, how that ties back to cost. So let’s talk about it.
In the AI world, a token is basically a chunk of text, sometimes a word, sometimes part of a word, sometimes even punctuation, that a model reads and processes. When you type a prompt, the model doesn’t see “sentences” the way we do. It sees tokens. The size of your prompt and the length of the response are measured in tokens, and that measurement directly impacts cost, speed, and performance. For example, one prompt could be:
“Create a structured end-to-end project plan for [PROJECT NAME], including timeline, milestones, task breakdown, ownership, risks, and success metrics. Present it in a clean, leadership-friendly format.”
How many tokens was that? About 30 to 40. Here’s where it gets tricky, though: the more sources or content you add, the more tokens you use. And as models evolve, the same prompt you use today could tokenize to a different count in the future. Does your head hurt yet? Me too.
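If you just want a ballpark number without calling a real tokenizer, a common rule of thumb is roughly 4 characters per token for English text. Here’s a minimal Python sketch built on that assumption. Real tokenizers (OpenAI’s tiktoken, for example) split text differently, so treat the output as a rough estimate only:

```python
# Rough token estimator. This is a heuristic sketch, not a real tokenizer:
# it assumes the common "~4 characters per token" rule of thumb for English.

def estimate_tokens(text: str) -> int:
    """Estimate token count from character length (ballpark only)."""
    return max(1, round(len(text) / 4))

prompt = (
    "Create a structured end-to-end project plan for [PROJECT NAME], "
    "including timeline, milestones, task breakdown, ownership, risks, "
    "and success metrics. Present it in a clean, leadership-friendly format."
)

print(estimate_tokens(prompt))  # a rough figure; real tokenizers will differ
```

The point isn’t precision, it’s intuition: longer prompts and longer responses mean more tokens, which means more cost.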
So why is this coming up now? From what we’re hearing from customers, it’s because AI products are increasingly priced on token usage per user. For example, someone in Engineering might need 10K tokens per week, while someone in Marketing may only need 5K. That difference starts to matter when pricing and budgets are tied directly to consumption.
As a result, admins are now trying to forecast token usage by role, estimating how many tokens marketing versus engineering will realistically need. There is growing anxiety around what happens if users exceed their limits, particularly whether overages will unexpectedly drive up costs. In some cases, this could mean closely monitoring token usage, potentially even daily, just to avoid budget surprises. That level of oversight is not something most teams want to manage long term.
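To make that forecasting exercise concrete, here’s a tiny back-of-the-napkin weekly budget in Python. The head counts, per-user allowances, and price per 1K tokens are all made up for illustration, so plug in your own team sizes and your vendor’s actual rates:

```python
# Hypothetical weekly token budget forecast by role.
# All numbers below are illustrative assumptions, not real pricing.

ROLES = {
    # role: (number of users, estimated tokens per user per week)
    "engineering": (12, 10_000),
    "marketing": (8, 5_000),
}

PRICE_PER_1K_TOKENS = 0.01  # hypothetical rate; check your vendor

def weekly_forecast(roles: dict) -> tuple[int, float]:
    """Return (total tokens, estimated dollar cost) for one week."""
    total = sum(users * per_user for users, per_user in roles.values())
    return total, total / 1_000 * PRICE_PER_1K_TOKENS

tokens, cost = weekly_forecast(ROLES)
print(f"{tokens:,} tokens, about ${cost:.2f}/week")
```

Even a crude model like this helps you spot which roles drive consumption before an overage surprises you.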
So what does this all mean?
Tokens are not something to fear, but they are something to understand. The more intentional you are with prompts, context, and usage, the more control you have over cost and performance. You do not need to count every token, but you should know that they are the currency powering everything behind the scenes. As AI pricing continues to evolve, you can:
- Ask questions and understand how your platform measures usage
- Ask teams how they are using AI today
- Make sure your teams are trained on writing efficient prompts
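On that last point, trimming filler from a prompt is often the easiest win. A quick sketch using a rough ~4-characters-per-token heuristic (real tokenizers will count differently, and both example prompts are made up):

```python
# Same request, two phrasings: politeness filler costs tokens.
# Uses a crude ~4 characters per token heuristic for illustration.

def estimate_tokens(text: str) -> int:
    return max(1, round(len(text) / 4))

verbose = (
    "I was wondering if you could possibly help me out by putting "
    "together some kind of summary of the attached meeting notes, "
    "if that's not too much trouble?"
)
concise = "Summarize the attached meeting notes in five bullet points."

print(estimate_tokens(verbose), "vs", estimate_tokens(concise))
```

The concise version asks for the same outcome at a fraction of the token spend, and that discipline compounds across a whole team.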
At the end of the day, tokens are just the meter running in the background. The real focus should be on outcomes. If the value outweighs the cost, you are doing it right.
How is your team approaching token usage today? I’d love to hear what’s working, what’s confusing, and what questions you’re still trying to answer.
PS - No tokens harmed in the writing of this blog post 😉
- Michael T.