Can I use an LLM to do that?

Do you have a problem and don't know if AI can help you solve it? Why not ask it? I share a practical example that you can adapt for your purpose.


A colleague from another team at Day Job reached out after my presentation about using LLMs the other week, as she is exploring the use of AI for her team. I asked her to explain the business problems she's trying to solve - she said one of the key questions is understanding the drivers of customer churn. Over a cup of tea, we discussed how she could use LLMs to run data analysis.

My team has a clear understanding of the problem we're trying to solve and how we can leverage AI in our solution. For other businesses and teams, these may not be as apparent. How do you explore using AI if you're not sure what it can do?

Why not ask?

I am a business analyst with a set of customer churn data, and I want to use AI to help me analyse the data and generate some actionable insights. Provide me with a step-by-step guide to using ChatGPT to conduct the analysis, including the best way to share data with ChatGPT and what types of outputs ChatGPT can generate.

Below is the response by GPT-5:

Here’s a practical, business-analyst-friendly guide to using ChatGPT for analysing your customer churn data and turning it into actionable insights.


1. Prepare Your Data

  • Clean it first – remove duplicates, fix inconsistent formats (dates, product names, etc.).
  • Use a structured format – CSV or Excel is best, with clear column headers.
  • Aggregate sensitive info – anonymise customer names or IDs if needed.

💡 Tip: Keep your dataset small enough for ChatGPT to handle directly (a few thousand rows max). For bigger datasets, share a sample plus a summary of key metrics.
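
If you'd rather do this clean-up locally before uploading, a minimal pandas sketch of the steps above could look like the following; the file and column names are illustrative assumptions, not prescriptions:

```python
# A minimal sketch of the clean-up steps above, using pandas.
# File and column names (churn.csv, signup_date, product, customer_id)
# are illustrative assumptions about your data.
import pandas as pd

df = pd.read_csv("churn.csv")

# Remove exact duplicate rows
df = df.drop_duplicates()

# Fix inconsistent formats: parse dates, tidy product names
df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")
df["product"] = df["product"].str.strip().str.title()

# Anonymise identifiers before sharing
df["customer_id"] = pd.factorize(df["customer_id"])[0]

df.to_csv("churn_clean.csv", index=False)
```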


2. Share Your Data with ChatGPT

  • Best method: Upload the CSV or Excel file directly here (drag-and-drop).
  • If your data is small, you can paste it in a Markdown table, but that’s slower and more prone to formatting issues.
  • For large datasets, summarise first (e.g., totals by month, top products) and share the summary.
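
A quick way to decide between pasting and summarising is to let the row count drive it. The sketch below assumes a churn_date column and needs the tabulate package installed for Markdown output:

```python
# Prepare either a pasteable sample or an aggregate summary,
# depending on dataset size. Column names are assumptions;
# to_markdown() requires the 'tabulate' package.
import pandas as pd

df = pd.read_csv("churn_clean.csv", parse_dates=["churn_date"])

if len(df) <= 200:
    # Small enough to paste straight into the chat as a Markdown table
    print(df.to_markdown(index=False))
else:
    # Otherwise share an aggregate summary, e.g. churned customers by month
    summary = (
        df.groupby(df["churn_date"].dt.to_period("M"))
          .size()
          .rename("churned_customers")
    )
    print(summary.to_markdown())
```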

3. Frame Your Analysis Goals

Before you paste the data, tell ChatGPT:

  • What you want to find out (e.g., “Identify seasonal trends in customer churn”).
  • The business context (e.g., “We sell 12 product lines in multiple regions”).
  • The type of output you want (e.g., “Charts, a short report, and recommendations for reducing churn”).

Example prompt:

“Here is customer churn data for the last 24 months. I want you to: summarise key trends, identify causes of high churn, and suggest actions to reduce it.”

4. Types of Analysis ChatGPT Can Do

With your data, ChatGPT can:

  • Summarise – total churn per product/region/month.
  • Trend analysis – detect growth or decline patterns.
  • Segmentation – compare performance across products, categories, or markets.
  • Root cause hints – flag potential causes from patterns in the data.
  • Forecasting – estimate future churn using simple statistical models.
  • Benchmarking – compare to provided targets or industry averages.
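
For the first two of these, the local pandas equivalent is only a few lines; the column names below are assumptions about your dataset:

```python
# Illustrative versions of the summary and trend analyses above.
# Column names (product, region, customer_id, churn_date) are assumptions.
import pandas as pd

df = pd.read_csv("churn_clean.csv", parse_dates=["churn_date"])

# Summarise: churned customers per product and region
by_segment = (
    df.groupby(["product", "region"])["customer_id"]
      .nunique()
      .sort_values(ascending=False)
)

# Trend analysis: monthly churn counts to spot growth or decline
monthly = df.groupby(df["churn_date"].dt.to_period("M"))["customer_id"].nunique()

print(by_segment.head(10))
print(monthly)
```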

5. Types of Outputs ChatGPT Can Generate

  • Tables – sorted rankings, pivot-style summaries.
  • Charts & Graphs – bar charts, line charts, pie charts (auto-generated here).
  • Dashboards – a set of charts in one place (static images).
  • Written Insights – a “management-friendly” summary of findings.
  • Action Plans – recommended next steps with potential impact.
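
The charts are the same kind you could reproduce locally with matplotlib; for example, a simple monthly churn line chart (column names again assumed):

```python
# A simple static chart, comparable to what ChatGPT generates in-chat.
# Column names are illustrative assumptions.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("churn_clean.csv", parse_dates=["churn_date"])

monthly = df.groupby(df["churn_date"].dt.to_period("M")).size()
monthly.index = monthly.index.to_timestamp()  # so the x-axis plots as dates

monthly.plot(kind="line", marker="o", title="Monthly customer churn")
plt.ylabel("Churned customers")
plt.tight_layout()
plt.savefig("monthly_churn.png")
```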

6. Iterative Refinement

  • Ask follow-up questions: “Can you break down the top 3 products with highest churn by region?”
  • Request different views: “Show this as a monthly trend chart instead of a table.”
  • Drill deeper into causes: “What events in June could explain the spike?”

7. Moving to a Repeatable Process

If you need to do this regularly:

  • Standardise your data format.
  • Save your prompt structure.
  • Use ChatGPT’s Code Interpreter / Python mode to automate parts of the analysis.
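
As a sketch of what that automation might look like, a small script (or the same logic run in ChatGPT's Python mode) can regenerate the headline numbers from each new extract; the names below are illustrative:

```python
# A minimal sketch of a repeatable churn report you could re-run
# (or ask ChatGPT's Python mode to run) on each new data extract.
# File and column names are illustrative assumptions.
import pandas as pd

def churn_report(path: str) -> dict:
    df = pd.read_csv(path, parse_dates=["churn_date"])
    monthly = df.groupby(df["churn_date"].dt.to_period("M")).size()
    by_product = df.groupby("product").size().sort_values(ascending=False)
    return {
        "total_churned": len(df),
        "worst_month": str(monthly.idxmax()),
        "top_churn_product": by_product.index[0],
    }

print(churn_report("churn_clean.csv"))
```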

If you’d like, I can create a ready-to-use “Customer Churn Analysis Prompt Template” so every time you upload new data, ChatGPT will automatically summarise, chart, and generate recommendations without you rewriting the instructions.

Do you want me to make that next?


Then I thought, what if my colleague has multiple sets of data? So I asked a follow up question.

If I have multiple sets of data with a common key such as customer reference number, can you run analysis across all data?

Yes — if your datasets share a common key (like a customer reference number), I can join them and analyse the combined data for deeper insights.

Here’s how it works step-by-step:


1. Prepare Your Data

  • Make sure each dataset is in CSV or Excel format with consistent key formatting (e.g., CUST001, not cust 1 in one file and C001 in another).
  • Use clear column names — avoid ambiguous headers like “Value” or “Info.”
  • If possible, include a short data dictionary describing each column.
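
If the key formats have drifted between files, a small normalisation step before uploading helps; the sketch below assumes the references differ only in case, spacing and zero-padding:

```python
# Normalise customer reference formats across files before joining.
# Assumes keys differ only in case, whitespace and zero-padding
# (e.g. "cust 1" or "C001" -> "CUST001"). File names are illustrative.
import pandas as pd

def normalise_key(series: pd.Series) -> pd.Series:
    digits = series.astype(str).str.extract(r"(\d+)")[0]
    return "CUST" + digits.str.zfill(3)

purchases = pd.read_csv("purchases.csv")
complaints = pd.read_csv("complaints.csv")

purchases["customer_ref_number"] = normalise_key(purchases["customer_ref_number"])
complaints["customer_ref_number"] = normalise_key(complaints["customer_ref_number"])
```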

2. Upload the Data

  • You can upload multiple files here in the same conversation.
  • Tell me which columns are the keys (e.g., customer_ref_number) so I can join them.

3. Define Your Analysis Goals

Tell me what you want to uncover across datasets, for example:

  • “Match purchase history (file 1) with complaint records (file 2) to see if high-complaint customers also have higher refunds.”
  • “Merge demographic info with product usage to segment customers by behaviour.”
  • “Correlate marketing spend with customer lifetime value.”

4. What I Can Do After Joining Data

Once merged, I can:

  • Create joined tables combining all relevant info per customer.
  • Run comparisons across data sources (e.g., purchases vs. support tickets).
  • Find correlations (e.g., more interactions → higher spend).
  • Segment customers by multiple factors (region, purchase type, retention).
  • Detect anomalies that only appear when datasets are combined.
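
If you later want to reproduce the join locally, the same step is a few lines of pandas; the file and column names here are assumptions:

```python
# Join two datasets on a shared key and look for cross-dataset patterns.
# File and column names are assumptions about your data.
import pandas as pd

purchases = pd.read_csv("purchases.csv")    # file 1
complaints = pd.read_csv("complaints.csv")  # file 2

merged = purchases.merge(complaints, on="customer_ref_number", how="left")

# e.g. do high-complaint customers also have higher refunds?
per_customer = merged.groupby("customer_ref_number").agg(
    complaints=("complaint_id", "count"),
    refunds=("refund_amount", "sum"),
)
print(per_customer["complaints"].corr(per_customer["refunds"]))
```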

5. Handling Large Data

If your datasets are too large to fully upload:

  • Send me samples + key summaries (aggregates, counts, averages).
  • I can then write you Python code to run locally that merges and analyses the full data.

If you want, I can show you exactly how to phrase the upload prompt so that ChatGPT will automatically merge and analyse multiple files without you having to manually explain the process each time.

Do you want me to make that upload prompt for you?


Last but not least, I wanted to know about limitations of using an LLM for data analysis.

Are there limitations of using ChatGPT for data analysis that I should be aware of?

Yes — ChatGPT can be powerful for data analysis, but there are real limitations you should keep in mind so you don’t hit unexpected roadblocks or misinterpret results.


1. Data Size Limitations

  • File size: In the chat interface, you can only upload relatively small files (usually a few MB).
  • Row limits: Practically, a few thousand rows are fine; hundreds of thousands will be too big to process directly.
  • Workaround: Upload a filtered sample or aggregate summaries and, if needed, have ChatGPT generate Python code to run on your full dataset locally.
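
One way to apply that workaround is to aggregate the full file locally in chunks and upload only the small summary; a rough sketch, with illustrative names:

```python
# Aggregate a file that is too big to upload, then share only the summary.
# File and column names are illustrative assumptions.
import pandas as pd

counts = {}
for chunk in pd.read_csv("full_churn_export.csv", chunksize=100_000,
                         parse_dates=["churn_date"]):
    monthly = chunk.groupby(chunk["churn_date"].dt.to_period("M")).size()
    for month, n in monthly.items():
        counts[month] = counts.get(month, 0) + n

summary = pd.Series(counts).sort_index()
summary.to_csv("monthly_churn_summary.csv")  # small enough to upload
```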

2. Processing Power & Memory

  • ChatGPT doesn’t “store” the full dataset in memory across multiple chats — once the session ends, the data is gone.
  • For very complex joins, calculations, or ML modelling on large datasets, local or cloud-based tools (e.g., Python, R, SQL, Excel, Power BI) will be more efficient.

3. Accuracy Risks

  • Results depend on data quality — if the data is messy or incomplete, insights can be misleading.
  • AI may hallucinate patterns if the data is too small or not representative.
  • Statistical outputs are often simpler than what a full analytics suite could provide.

4. Statistical Depth

  • ChatGPT can run basic to intermediate stats (averages, correlations, regressions, forecasts), but:
    • It’s not a substitute for specialist statistical software for advanced modelling.
    • It won’t automatically perform significance testing unless you ask for it.
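
If you do want a significance test, ask for it explicitly or run it yourself; for example, a two-sample t-test comparing the tenure of churned and retained customers (column names are assumptions):

```python
# The kind of significance test ChatGPT won't run unless asked:
# do churned and retained customers differ in tenure? Columns are assumptions.
import pandas as pd
from scipy import stats

df = pd.read_csv("churn_clean.csv")

churned = df.loc[df["churned"] == 1, "tenure_months"]
retained = df.loc[df["churned"] == 0, "tenure_months"]

t_stat, p_value = stats.ttest_ind(churned, retained, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```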

5. Privacy & Security

  • You shouldn’t upload sensitive personal data (e.g., names, credit card numbers, confidential business data) unless anonymised or unless you’re in a secure enterprise environment with proper agreements in place.
  • OpenAI states it doesn’t use your files for training, but you should still follow data governance rules.
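
A simple local pseudonymisation step before uploading, such as salted hashing of identifiers, keeps the join key usable without exposing the real references; treat the sketch below as illustrative, not a substitute for your own data governance rules:

```python
# Replace customer identifiers with salted hashes before uploading.
# Use the same salt across files so the join key still matches.
# A sketch only; file and column names are assumptions.
import hashlib
import pandas as pd

SALT = "replace-with-a-secret-salt"

def pseudonymise(value: str) -> str:
    return hashlib.sha256((SALT + str(value)).encode()).hexdigest()[:12]

df = pd.read_csv("churn_clean.csv")
df["customer_id"] = df["customer_id"].map(pseudonymise)
df = df.drop(columns=["customer_name", "email"], errors="ignore")
df.to_csv("churn_anonymised.csv", index=False)
```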

6. Visualisation Limits

  • It can generate charts and graphs here, but:
    • They’re static — not interactive dashboards.
    • Complex multi-layer visualisations (like Power BI) are best done externally.

7. Context Loss

  • In long analyses, ChatGPT may “forget” earlier steps if the conversation gets too long.
  • Workaround: Summarise key findings and re-share them as you progress, or keep the workflow modular.

💡 When ChatGPT works best for data analysis:

  • Exploratory analysis on manageable datasets.
  • Quickly generating summaries, visualisations, and insights.
  • Writing reusable analysis scripts for you to run on full data locally.
  • Helping brainstorm interpretations and business implications of trends.

If you’d like, I can put together a "ChatGPT Data Analysis Decision Guide" — a one-pager that tells you when to use ChatGPT directly, when to use it for coding help, and when to switch to specialised tools. That would make it much easier to decide which approach to take for each dataset.

Do you want me to prepare that for you?


The Q&As above are an example of how to explore an LLM's capabilities in solving a problem.

In the initial prompt I followed the RGTD prompting framework.

  • Role: Business analyst
  • Goal: Analyse data and generate actionable insights
  • Task: Provide a step-by-step guide to conduct data analysis
  • Details: How to share data with ChatGPT and what types of outputs ChatGPT can generate.

I plan to share the above with my colleague - I hope she finds it useful as she embarks on her AI data analytics journey.


Since I'm talking about using LLMs for work today, I thought I'd share a few of my learnings so far (I posted the below on LinkedIn after the presentation).

1. Have a well-defined use case: There is a danger in seeing AI as the solution but not knowing what you're trying to solve. A clear business objective and clarity on how to leverage AI sharpen our focus and minimise distractions in our discovery and solution design.

2. Deep dive into the problem space: This is not unique to AI related work - by investing time in both business and technical discovery, we're developing a solution that is truly fit-for-purpose, as well as incorporating guardrails to ensure key risks are mitigated.

3. Use humans as your guide: If we want AI agents to help streamline a process, it's vital to start with a deep understanding of how it is done now - and not just the happy path. The knowledge of our subject matter experts is invaluable, and we engage with them daily to continually refine our solution.

4. Iterate, iterate, iterate: I've lost track of how many times I've updated my LLM prompts - but iteration is the secret sauce. By continually testing and refining the context and instructions we give to the AI agents, we improve their accuracy and robustness, and we identify opportunities to uplift our current operations - for humans and for AI.


Why don't you ...

Find a problem you're trying to solve and use the RGTD framework to ask your LLM of choice how it can help.

Until next week.

Vee