
Automating Earnings Call Sentiment Analysis with Gemini Pro

By Vo Tu Duc
March 29, 2026

Earnings call transcripts are goldmines of strategic financial context, but extracting actionable insights from dense, jargon-heavy text is a formidable big data challenge. Discover how to move beyond manual workflows and transform unstructured corporate speak into structured, queryable intelligence at scale.


The Challenge of Processing Earnings Call Transcripts

For modern financial institutions, earnings calls are absolute goldmines of unstructured data. While the accompanying press releases provide the hard quantitative metrics—revenue, earnings per share (EPS), and operating margins—the live calls offer something far more nuanced: the context behind the numbers. During these sessions, executives provide forward-looking guidance, explain strategic pivots, and, most importantly, face unscripted Q&A sessions from analysts.

However, extracting actionable intelligence from these events presents a massive data engineering and processing hurdle. Transcripts are dense, often spanning 10,000 to 15,000 words per call, and are heavily laden with industry-specific jargon, financial nomenclature, and carefully crafted corporate speak. Transforming this unstructured text into structured, queryable insights at scale is a classic big data problem that traditional, manual workflows are fundamentally ill-equipped to handle.

Time Constraints for Financial Analysts

During peak earnings season, the sheer velocity and volume of information create a severe bottleneck. A typical financial analyst or portfolio manager might be responsible for covering dozens of companies, many of which schedule their quarterly earnings calls within the same compressed two-to-three-week window.

Manually listening to a one-hour webcast or reading through a 20-page transcript takes significant time—time that analysts simply do not have when the market is actively reacting to the news. The latency between a call concluding and an analyst publishing their research note or adjusting a financial model can mean the difference between capitalizing on alpha and entirely missing a market movement.


Furthermore, human fatigue inevitably sets in. When an analyst is reviewing their fifth transcript of the day, their ability to accurately parse complex financial data, cross-reference historical statements, and maintain a high level of analytical rigor degrades. The industry requires a mechanism to ingest, process, and summarize this massive influx of text with near real-time latency, freeing analysts to focus on high-level strategic decision-making rather than manual data extraction.

The Need for Automated Narrative Tracking

While quantitative data can be easily scraped and ingested into relational databases, the qualitative data—the narrative and sentiment—is notoriously difficult to track. Financial markets are driven just as much by executive tone and sentiment as they are by raw numbers. Is the CEO sounding “cautiously optimistic” or “defensive” during the Q&A? Are management’s explanations for supply chain headwinds consistent with what they stated two quarters ago?

Tracking these subtle narrative shifts across multiple quarters and across different companies within a sector is virtually impossible to do manually with any degree of consistency. Human analysts carry inherent cognitive biases; their interpretation of a CEO’s tone might be influenced by market conditions on that specific day or their preconceived notions about the company.

This creates a critical need for automated narrative tracking. By leveraging advanced Natural Language Processing (NLP) and Large Language Models (LLMs), organizations can establish a standardized, objective baseline for measuring narrative and sentiment. Automated systems can instantly detect shifts in executive confidence, flag evasive language during Q&A sessions, and categorize thematic narratives (e.g., AI investments, macroeconomic pressures, or regulatory hurdles) across thousands of transcripts. This systematic approach transforms subjective human interpretation into quantifiable, trackable data points that can be seamlessly integrated into algorithmic trading models and long-term investment strategies.

Building an Automated Financial Summarizer

Transforming dense, multi-page earnings call transcripts into actionable sentiment data requires a robust, scalable architecture. Rather than relying on manual review or disjointed third-party tools, we can build a seamless, serverless pipeline entirely within the Google ecosystem. By bridging Google Workspace tools such as Drive and Sheets with Google Cloud’s advanced AI capabilities, we can create a summarizer that is both highly intelligent and deeply integrated into your daily financial workflows.

Core Logic and Workflow Design

The foundation of our automated financial summarizer relies on a straightforward, event-driven architecture. The goal is to minimize human intervention: once a transcript is acquired, the system should handle the rest. Here is the step-by-step workflow design:

  1. Ingestion and Triggering: The pipeline begins in Google Drive. We designate a specific “Earnings Transcripts” folder. Whenever a new transcript (in PDF, Google Doc, or plain text format) is uploaded to this folder, a time-driven Google Apps Script trigger detects the new file.

  2. Text Extraction: Upon detection, the script reads the file. If it’s a Google Doc, the script extracts the body text directly using the DocumentApp service. For PDFs, Google Drive’s built-in OCR capabilities can be leveraged during the file retrieval process to convert the document into readable text.

  3. Prompt Assembly: The raw text is then combined with a meticulously engineered prompt. This prompt instructs the AI to act as a seasoned financial analyst, asking it to extract key metrics, identify forward-looking statements, and score the overall sentiment (e.g., Bullish, Bearish, or Neutral) based on executive tone and Q&A interactions.

  4. AI Processing: The assembled payload is sent via API to Gemini Pro. Gemini processes the extensive context of the transcript, reasoning through the financial jargon and nuanced executive commentary.

  5. Structured Output: Gemini Pro is instructed to return the analysis in a structured JSON format. The script parses this JSON and appends the data—such as Company Name, Quarter, Sentiment Score, and Key Takeaways—as a new row in a centralized Google Sheet. This Sheet can then serve as a live database for downstream data visualization in Looker Studio.
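Conceptually, steps 1 and 2 above might look like the Apps Script sketch below. This is a minimal illustration, not the article’s production code: the `FOLDER_ID` and `LAST_RUN` property keys and the downstream `processTranscriptFile` function are hypothetical placeholders, and only the pure `extractionStrategy` helper runs outside the Workspace environment.

```javascript
// Pure helper: decide the extraction strategy from a file's MIME type.
function extractionStrategy(mimeType) {
  if (mimeType === 'application/vnd.google-apps.document') return 'google-doc';
  if (mimeType === 'application/pdf') return 'ocr';
  if (mimeType === 'text/plain') return 'plain-text';
  return 'unsupported';
}

// Apps Script entry point, wired to a time-driven trigger (requires the
// Google Workspace environment; not runnable locally). FOLDER_ID and
// processTranscriptFile are assumed placeholders.
function pollTranscriptFolder() {
  const props = PropertiesService.getScriptProperties();
  const lastRun = new Date(props.getProperty('LAST_RUN') || 0);
  const folder = DriveApp.getFolderById(props.getProperty('FOLDER_ID'));
  const files = folder.getFiles();
  while (files.hasNext()) {
    const file = files.next();
    if (file.getDateCreated() > lastRun &&
        extractionStrategy(file.getMimeType()) !== 'unsupported') {
      // Hand off to the extraction and analysis steps described above.
      processTranscriptFile(file); // hypothetical downstream function
    }
  }
  props.setProperty('LAST_RUN', new Date().toISOString());
}
```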

Leveraging Gemini Pro and Apps Script

The true magic of this solution lies in the synergy between Google Apps Script and Gemini Pro.

Google Apps Script serves as our serverless orchestration layer. Because it is natively embedded within Google Workspace, it eliminates the need to spin up external servers, manage authentication tokens for Drive/Sheets, or configure complex IAM roles just to move data between documents. Using standard JavaScript, we can control the entire flow. The UrlFetchApp class is the workhorse here, allowing us to make secure HTTP requests directly to the Vertex AI or Google AI Studio endpoints.

Gemini Pro is the analytical brain of the operation. Earnings calls are notoriously long, often exceeding 10,000 words. Gemini Pro’s massive context window makes it uniquely suited for this task, allowing it to ingest the entire transcript in a single pass without the need for complex text-chunking or vector database setups. Furthermore, its advanced reasoning capabilities excel at reading between the lines—for instance, detecting the subtle hesitation in a CFO’s response during the Q&A session, which often holds more weight than the prepared remarks.

Here is a conceptual look at how Apps Script bridges the gap to Gemini Pro:


function analyzeTranscriptSentiment(transcriptText) {
  const apiKey = PropertiesService.getScriptProperties().getProperty('GEMINI_API_KEY');
  const endpoint = `https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent?key=${apiKey}`;

  const prompt = `
You are an expert financial analyst. Read the following earnings call transcript and provide a JSON response with the following keys:
- "sentiment": (Bullish, Bearish, or Neutral)
- "key_drivers": (Top 3 factors driving the sentiment)
- "summary": (A 2-sentence executive summary)

Transcript: ${transcriptText}
`;

  const payload = {
    contents: [{
      parts: [{ text: prompt }]
    }]
  };

  const options = {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify(payload),
    muteHttpExceptions: true
  };

  const response = UrlFetchApp.fetch(endpoint, options);
  // muteHttpExceptions suppresses fetch errors, so check the status ourselves
  if (response.getResponseCode() !== 200) {
    throw new Error('Gemini API error: ' + response.getContentText());
  }
  const jsonResponse = JSON.parse(response.getContentText());

  // Extract and return the generated text from the first candidate
  return jsonResponse.candidates[0].content.parts[0].text;
}
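One practical caveat when free-form prompting like this: the model sometimes wraps its JSON answer in a markdown fence or adds stray prose. A small defensive parser (an assumption on my part, not part of the original script) keeps the pipeline from crashing on such responses:

```javascript
// Strip an optional ```json ... ``` fence from a model response before
// parsing it. Throws if the remaining text is still not valid JSON.
function parseGeminiJson(rawText) {
  const cleaned = rawText
    .replace(/^\s*```(?:json)?\s*/i, '') // leading fence, if any
    .replace(/\s*```\s*$/, '')           // trailing fence, if any
    .trim();
  return JSON.parse(cleaned);
}
```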

By combining the low-code agility of Apps Script with the enterprise-grade intelligence of Gemini Pro, Cloud Engineers can deploy a highly sophisticated financial analysis tool in a matter of hours, transforming static text into dynamic, market-ready intelligence.

Extracting and Processing Transcript Data

Once the earnings call concludes and the raw transcript is generated, the real engineering work begins. In this phase, we must build a robust, scalable ingestion mechanism and prepare the unstructured text for our large language model. Leveraging Google Cloud’s serverless ecosystem ensures that our pipeline can handle massive spikes in data volume—a common occurrence during peak earnings seasons—without requiring manual intervention or complex infrastructure management.

Exporting Transcripts into the Pipeline

To automate the flow of financial data, we need a reliable, event-driven entry point. A highly effective architectural pattern is to utilize Google Cloud Storage (GCS) as our initial landing zone. Whether your transcripts are fetched automatically via a third-party financial API or synced from a collaborative Google Workspace environment using the Google Drive API, the destination remains a dedicated, secure GCS bucket.

By configuring Cloud Storage triggers via Eventarc, every new transcript upload automatically generates an event. This event seamlessly routes through Pub/Sub to trigger a serverless compute instance, such as a Cloud Function or a Cloud Run service.

Inside this processing layer, the raw text is ingested and sanitized. Earnings transcripts are notoriously noisy, often filled with boilerplate safe-harbor statements, operator instructions, and formatting artifacts. The Cloud Function strips this extraneous metadata. Furthermore, while Gemini Pro boasts a massive context window, strategically chunking the text by speaker or by presentation versus Q&A session can significantly improve the granularity and accuracy of the downstream sentiment analysis.
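The sanitization and chunking described above could be sketched as follows. The regex patterns for operator lines, safe-harbor boilerplate, and the Q&A section marker are illustrative assumptions; real transcripts will need patterns tuned to your data source.

```javascript
// Remove common transcript noise. The patterns below are examples only.
function sanitizeTranscript(rawText) {
  return rawText
    .replace(/^Operator:.*$/gim, '') // operator instructions
    .replace(/forward-looking statements[\s\S]*?risks and uncertainties\.?/gi, '') // safe-harbor boilerplate
    .replace(/\n{3,}/g, '\n\n')      // collapse runs of blank lines
    .trim();
}

// Split prepared remarks from the Q&A session on a common section marker,
// so each chunk can be analyzed with finer granularity.
function splitSections(text) {
  const idx = text.search(/question-and-answer session/i);
  if (idx === -1) return { remarks: text, qa: '' };
  return { remarks: text.slice(0, idx).trim(), qa: text.slice(idx).trim() };
}
```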

Configuring Gemini for Growth and Risk Keywords

With clean, structured transcript data flowing through our pipeline, the next step is routing it to Vertex AI to harness the reasoning capabilities of Gemini Pro. The secret to extracting actionable financial intelligence lies in precise prompt engineering and strict model configuration. We aren’t just looking for a generic “positive” or “negative” sentiment score; we need to isolate specific, forward-looking business indicators.

First, we configure the Gemini Pro model parameters for analytical rigor. Setting a low temperature (e.g., 0.1 or 0.2) minimizes hallucination and ensures highly deterministic, consistent outputs—a strict requirement when dealing with financial data.

Next, we design a system prompt that explicitly instructs the model to act as an expert financial analyst, focusing on two primary extraction targets:

  • Growth Keywords and Themes: We instruct Gemini to identify, extract, and contextualize phrases related to market expansion, successful product launches, margin improvements, bullish revenue guidance, and strategic acquisitions.

  • Risk Keywords and Themes: Conversely, we direct the model to flag macroeconomic headwinds, supply chain bottlenecks, inflationary pressures, regulatory hurdles, or downward revisions in guidance.

To ensure this data integrates smoothly into the rest of our pipeline, we leverage Gemini’s structured output capabilities. By passing a predefined JSON schema in our Vertex AI API request, we force Gemini to return its analysis as a machine-readable JSON object. This payload includes arrays of extracted quotes categorized strictly as “Growth” or “Risk,” alongside a calculated sentiment severity score for each. This structured response is then effortlessly parsed by our Cloud Function and streamed directly into BigQuery, ready for complex querying and dashboard visualization.
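Putting the low temperature and the growth/risk schema together, the request body might look like the sketch below. The field names follow the Gemini REST API’s `generationConfig.responseMimeType` / `responseSchema` conventions, but verify them against the current API version before relying on this shape; the schema keys themselves are assumptions.

```javascript
// Build a Gemini request that enforces structured growth/risk output
// with a low temperature for deterministic, consistent analysis.
function buildAnalysisRequest(transcriptText) {
  return {
    contents: [{
      parts: [{ text: 'Analyze this earnings call transcript:\n' + transcriptText }]
    }],
    generationConfig: {
      temperature: 0.1,                       // minimize variance across runs
      responseMimeType: 'application/json',   // force machine-readable output
      responseSchema: {
        type: 'OBJECT',
        properties: {
          growth_quotes: { type: 'ARRAY', items: { type: 'STRING' } },
          risk_quotes: { type: 'ARRAY', items: { type: 'STRING' } },
          severity_score: { type: 'NUMBER' }
        },
        required: ['growth_quotes', 'risk_quotes', 'severity_score']
      }
    }
  };
}
```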

Generating Actionable Insights in Google Sheets

Once Gemini Pro has digested the dense, unstructured text of an earnings call transcript and extracted the nuanced sentiment, the next critical step is making that data accessible. Raw JSON responses sitting in a cloud logging console won’t help your financial analysts, portfolio managers, or executive teams. We need to bridge the gap between advanced AI processing and everyday business operations. Google Sheets serves as the ideal presentation layer for this pipeline, transforming raw sentiment scores and extracted quotes into a dynamic, filterable, and highly collaborative dashboard.

Structuring the Output Data

A well-architected spreadsheet is the foundation of any robust automated report. To maximize the utility of Gemini Pro’s analysis, we must map the AI’s output directly to a logical, predefined schema in Google Sheets.

When prompting Gemini Pro, leveraging the response_mime_type: "application/json" parameter is a best practice. By passing a strict JSON schema in your system instructions, you guarantee the model returns key-value pairs that perfectly align with your spreadsheet headers. A highly effective column structure for earnings call insights includes:

  • Timestamp: The date and time the analysis was run.

  • Company & Ticker: The entity being analyzed (e.g., “Alphabet Inc. (GOOGL)”).

  • Fiscal Quarter: The specific reporting period (e.g., “Q3 2023”).

  • Overall Sentiment: A categorical label (Bullish, Bearish, Neutral) or a normalized numerical score (-1.0 to 1.0).

  • Executive Tone: A brief classification of management’s delivery (e.g., “Cautiously Optimistic”, “Defensive”).

  • Key Bullish Drivers: An array of positive catalysts extracted by Gemini, joined into a readable string.

  • Key Bearish Drivers: Extracted risk factors, headwinds, or missed targets.

  • Forward-Looking Guidance: Specific mentions of future revenue projections or strategic pivots.

By structuring the data this way, you empower your team to easily build pivot tables, generate time-series charts of sentiment trends across multiple quarters, and cross-reference AI-derived sentiment shifts with actual market movements.
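A small mapper can turn Gemini’s parsed JSON into a row matching the column structure above. The key names (`company`, `bullish_drivers`, etc.) are assumptions that must match whatever JSON schema you pass to the model:

```javascript
// Map a parsed Gemini analysis object to a spreadsheet row whose order
// matches the headers: Timestamp, Company & Ticker, Fiscal Quarter,
// Overall Sentiment, Executive Tone, Bullish Drivers, Bearish Drivers,
// Forward-Looking Guidance. All key names are hypothetical.
function buildInsightRow(analysis) {
  return [
    new Date().toISOString(),
    analysis.company + ' (' + analysis.ticker + ')',
    analysis.fiscal_quarter,
    analysis.overall_sentiment,
    analysis.executive_tone,
    (analysis.bullish_drivers || []).join('; '),
    (analysis.bearish_drivers || []).join('; '),
    analysis.guidance || ''
  ];
}
```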

Automating the Final Sheet Report

With the data schema defined, we can automate the final leg of the pipeline using Google Apps Script or a Google Cloud Function leveraging the Google Sheets API. For a seamless, native Google Workspace integration, Apps Script provides an incredibly efficient execution environment.

The automation logic follows a straightforward sequence:

  1. Fetch and Parse: The script receives the JSON payload generated by the Gemini API. Using JSON.parse(), it extracts the specific data points mapped to our column structure.

  2. Append Data: Using the SpreadsheetApp service, the script targets the specific destination sheet and utilizes the appendRow() method. This ensures that every new earnings call analysis is neatly added to the bottom of the dataset without overwriting historical records.

  3. Visual Formatting: To make the insights instantly actionable, the script can programmatically apply conditional formatting. For example, using the ConditionalFormatRule builder, the script can automatically highlight the “Overall Sentiment” cell in green for “Bullish” and red for “Bearish”, allowing analysts to gauge the temperature of a call at a single glance.
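Steps 2 and 3 might be sketched as below. For brevity this version tints the sentiment cell directly with `setBackground` instead of building a `ConditionalFormatRule`; the sheet ID and column position are placeholders, and only the pure `sentimentColor` helper runs outside Apps Script.

```javascript
// Pure helper: map a sentiment label to a highlight color (hex).
function sentimentColor(sentiment) {
  if (sentiment === 'Bullish') return '#b7e1cd'; // light green
  if (sentiment === 'Bearish') return '#f4c7c3'; // light red
  return '#ffffff';
}

// Apps Script sketch (runs only inside Google Workspace; 'SHEET_ID' and
// the 'Insights' sheet name are assumptions). Appends the analysis row
// and colors the "Overall Sentiment" cell.
function appendAnalysisRow(row) {
  const sheet = SpreadsheetApp.openById('SHEET_ID').getSheetByName('Insights');
  sheet.appendRow(row);
  const lastRow = sheet.getLastRow();
  const sentimentCell = sheet.getRange(lastRow, 4); // "Overall Sentiment" column
  sentimentCell.setBackground(sentimentColor(sentimentCell.getValue()));
}
```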

To make this a true zero-touch pipeline, the script can be paired with a time-driven trigger that watches a designated Drive folder. The moment a new earnings transcript PDF or text file is dropped into that folder, the trigger fires, Gemini Pro processes the text, and seconds later a neatly formatted, sentiment-analyzed row appears in your team’s shared Google Sheet, ready for the morning briefing.

Scaling Your Financial Analysis Architecture

While running a standalone Apps Script to query Gemini Pro is an excellent proof of concept, enterprise-grade financial analysis demands a robust, highly available, and secure infrastructure. To truly capitalize on automated sentiment analysis for earnings calls, you need to transition from a single script to a fully integrated cloud architecture.

As a Cloud Engineer, I recommend building an event-driven pipeline leveraging the broader Google Cloud ecosystem. Here is how you can scale this solution:

  • Automated Ingestion: Utilize Cloud Storage as your data lake. Whenever a new earnings call transcript or audio file is uploaded (either manually or via a third-party financial API), it triggers an event.

  • Serverless Processing: Use Eventarc to route this upload event to a Cloud Run service or a Cloud Function. This serverless compute layer will automatically invoke the Gemini Pro API via Vertex AI, passing the transcript along with your carefully crafted sentiment analysis prompts.

  • Structured Storage: Instead of outputting to a terminal, the serverless function parses Gemini’s JSON response—extracting bullish/bearish indicators, executive tone, and key forward-looking statements—and streams this structured data directly into BigQuery.

  • Workspace Integration: To bridge the gap between backend engineering and frontend analysts, integrate the pipeline with Google Workspace. Using Google Apps Script and the BigQuery API, you can automatically generate formatted summary reports in Google Docs, populate tracking sheets in Google Sheets, or even trigger real-time alerts in Google Chat if Gemini Pro detects a sudden spike in negative sentiment regarding revenue guidance.
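The ingestion and processing bullets above can be sketched as a minimal Node.js event handler. The `bucket`/`name` fields follow the shape of a Cloud Storage object-finalized event delivered via Eventarc; downloading the object, calling Vertex AI, and streaming into BigQuery are left as placeholder comments rather than real client calls:

```javascript
// Conceptual handler for a GCS upload event routed via Eventarc to
// Cloud Run or Cloud Functions. The event shape and return values are
// illustrative assumptions, not a production implementation.
function handleTranscriptUpload(cloudEvent) {
  const { bucket, name } = cloudEvent.data;
  if (!name.endsWith('.txt') && !name.endsWith('.pdf')) {
    return { skipped: true, reason: 'unsupported file type', object: name };
  }
  // In production: download gs://<bucket>/<name>, sanitize the text,
  // call Gemini Pro via Vertex AI, and stream the structured JSON
  // response into BigQuery.
  return { skipped: false, object: `gs://${bucket}/${name}` };
}
```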

By utilizing serverless components and managed AI services, this architecture scales automatically during the peak of earnings season and scales down to zero when the market is quiet, ensuring optimal cost-efficiency.

Key Benefits of Data Driven Workflows

Transitioning your financial analysis to a cloud-native, AI-driven workflow fundamentally changes how your organization interacts with market data. By automating the extraction of sentiment from earnings calls, you unlock several strategic advantages:

  • Unprecedented Speed to Insight: Human analysts take hours to read, digest, and summarize a 50-page earnings transcript. A Gemini-powered pipeline processes the same document in seconds, delivering actionable sentiment scores to your dashboard before the market even has time to react.

  • Elimination of Cognitive Bias: Human interpretation of executive tone can be subjective and influenced by fatigue. Gemini Pro applies your predefined analytical framework consistently across every single transcript, ensuring that a “cautiously optimistic” tone is scored with mathematical consistency, quarter after quarter.

  • Quantitative Transformation: By storing LLM outputs in BigQuery, you transform qualitative, unstructured text into quantitative data. This allows your quantitative analysts (quants) to backtest trading strategies against historical sentiment scores, correlating executive tone shifts with subsequent stock price movements.

  • Enhanced Analyst Productivity: Rather than spending earnings season buried in transcripts, your financial analysts can focus on high-value tasks: interpreting the aggregated data, refining investment theses, and making strategic portfolio decisions based on the automated insights.

Book a Discovery Call with Vo Tu Duc

Implementing a scalable, AI-driven financial architecture requires more than just API access; it demands a deep understanding of cloud engineering, security best practices, and prompt engineering. If your organization is ready to modernize its financial analysis workflows and harness the full power of Gemini Pro, Google Cloud, and Google Workspace, expert guidance is just a click away.

Book a discovery call with Vo Tu Duc today. As an expert in Google Cloud and enterprise AI architectures, Vo Tu Duc will work with your team to evaluate your current data pipelines, identify integration points, and design a custom, secure, and highly scalable sentiment analysis solution tailored to your firm’s specific investment strategies.

Stop reading transcripts and start analyzing data. Schedule your consultation to take the first step toward a smarter, faster financial workflow.


Tags: Sentiment Analysis, Gemini Pro, Financial AI, Earnings Calls, Data Automation, NLP
