
Epics and issues

Create a hierarchy of tasks for your team.
Help your team focus by breaking down larger initiatives into smaller tasks with clear deadlines.
A few notable definitions.
To keep product launches on track, many teams rely on tools that help them manage the following:
💎 Epics - Product planning often starts with teams identifying larger initiatives, or epics.
✅ Issues / tasks - Actionable items, called “issues,” are added to epics so more granular tasks can be assigned to specific owners.
👩🏻‍💻 User stories - Features and functionalities stated from the perspective of the user/customer—their goals and their expectations of the product.
🗓 Timeline - A clear view of when epics and issues are due, which helps keep deadlines from slipping and mitigate any blockers that arise. You can update dates and times by dragging the bars directly on the Timeline.
This template lets you do all of the above in the same document, making it easy to add Coda building blocks, embed images, comment, and collaborate—all from inside a cell. This means you won’t need to switch to a separate page or doc to start writing.
How to use this template
1. Make a copy of this doc for you and your team. You only need to do this once a quarter or year, depending on your product cycles.
2. Clear all the sample data.
3. Head to the Customizations page to personalize this doc.
If you use Jira to track issues.
You can pull in and sync your data to Coda instead, so you can visualize progress and tie issues to your project strategy write-ups, all in the same doc. Try the template that comes with a Jira integration built in.

Timeline view

Automation Backend
Create Lambda to retrieve survey from front end API
Add automatic triggering of Lambda
Extend Lambda to convert coda API data to state machine JSON
Implement DynamoDB for JSON storage
Create Lambda service to retrieve JSON from DynamoDB when triggered
Extend Lambda to return a state from JSON when executed with a current-state context
Attach API gateway to Lambdas
Modify existing/create new Lambda service which receives data from Landbot and stores it in SQL database
Set up development environments for Lambdas
Implement Endpoint Testing on Lambdas
Monitoring
Implement monitoring of Lambda triggers
Implement monitoring in API code according to delta standards
Add metrics to monitor in Landbot
Landbot V2
Create a development environment in Landbot to test new bots
Create a generic bot flow with webhooks that query the Lambda service for survey questions
Redesign the signup flow to streamline and debug the process
Extend bot flow to send data to Lambda service for DB storage via API call
Data Handling
Identify current database issues
Investigate existing database schema
Debug Database or Migrate to new schema
Frontend Redesign
Standardize survey question format
Data Visualisation Template
Onboard onto Tableau
Investigate Google Cloud NLP
Data Cleaning
Implement data cleaning pipeline
Database Functionality Fix
Ensure correct mapping of UserID to LandbotProfileID
Determine Validation Strategy
Update user demographic details from landbot tables
Add Sheets data to SurveyResults
DB participant statuses and balances
Update all user numbers from landbot data
Determine triggers to insert a record into SurveyResults
Copy Transactions table and retool logic
Set up triggers to insert records into transactions table
Test transactions and balances
[Timeline grid: February 19 – 28, 2026]

Epics

Add new epic
Automation Backend
Description
Overview
This epic encapsulates the creation of a serverless backend for automating Landbot. Two main services are required:
A service that listens for updates on the Coda survey form and creates a state-machine JSON representation, which is stored in DynamoDB
Key Owner
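The survey-to-state-machine conversion this epic describes can be sketched in Python. The question-list shape and the JSON layout (`startAt`, `states`, `next`) are illustrative assumptions, not Coda's actual API format:

```python
import json

def survey_to_state_machine(survey_id, questions):
    """Convert an ordered list of survey questions into a linear
    state-machine JSON document. Each question becomes one state;
    "end" marks completion. Field names here are assumptions."""
    states = {}
    for i, question in enumerate(questions):
        name = f"q{i}"
        nxt = f"q{i + 1}" if i + 1 < len(questions) else "end"
        states[name] = {"prompt": question, "next": nxt}
    return {
        "surveyId": survey_id,
        "startAt": "q0" if questions else "end",
        "states": states,
    }

machine = survey_to_state_machine(
    "survey-001",
    ["How satisfied are you?", "Any further comments?"],
)
print(json.dumps(machine, indent=2))
```

In the real service the resulting document would be written to DynamoDB keyed on the survey ID, so the second Lambda can fetch it when the bot flow runs.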
Testing Pipeline
Description
Overview
This Epic details the implementation of a testing pipeline to fit in with our automated services, as well as some manual processes.
Business motivation
Key Owner
Data Cleaning
Description
Overview
The goal of this epic is an automated data-cleaning pipeline, covering both the processes that interact with the central SQL database and the data already within it. Beyond data-type and value correctness, cleaning here means validating the answers respondents have provided, and identifying and handling cases where invalid or fraudulent data has been submitted.

Key Owner
Landbot V2
Description
Overview
The Landbot flows need to be cleaned up to ensure compatibility and functionality with the new services. This epic describes the steps to achieve Landbot flows that are fully automated and require no intervention between surveys. In particular, this focuses on:
Ensuring the sign-up flow is bug-free and can scale to add more features for respondents
Key Owner
Data Handling
Description
Overview
This epic describes the tasks to restructure the SQL database so it can best be utilized by all the other systems, as well as implementing that integration and providing backup systems while the restructuring occurs.

Key Owner
Data Visualisation Template
Description
Overview
The ultimate goal for data visualisation is to fully automate it using machine learning, with insights gained from the data respondents provide and their answer history. In the short term, however, the quickest gains come from a simplified data visualisation template that reduces the hours needed to construct a client report. Once the database is restructured, this template can be integrated with it.

Key Owner
Monitoring
Description
Overview
Though not mission critical, the stories in this epic all contribute towards gaining insight into the performance of the business systems. This includes metrics such as chatbot latency, the lead time between ideas and implementation, and other measures of performance. At the very least, basic alerts should be created early on to detect failures of key systems. This work is ongoing and should be extended as new features are added.

Key Owner
Frontend Redesign
Description
Overview
In order to make an effortless self-service website for our clients, we need to redesign our existing frontend form builder to provide more utility. This includes redesigning the form builder in Coda, or moving it to Retool entirely, as well as extending the front end with a web page that allows for user account creation and monitoring of survey progress.

Key Owner
Database Functionality Fix
Description
Overview
The main goal is to make the SQL database usable by Retool and other services, allowing correct manipulation and observation of the data. Currently, the database contains bugs that prevent it from functioning properly. The objectives are to:
Fix bugs to ensure that UserID as a foreign key correctly maps one-to-one to LandbotProfileID, the primary key in the Users table
Key Owner
DB participant statuses and balances
Description
Overview
This epic encapsulates the tasks for implementing a means of tracking a participant’s status for a particular survey (unchecked, passed, failed), as well as recording their owed balance and paid status. This includes making it easy to change a participant’s status while viewing their answers for a survey, and being able to present participants’ payments.

Key Owner

Issues

Filter issue by quarter:
Add new issue
Quarter
Epic
Issues
Details
Start
Duration
Team(s)
Effort
Create Lambda to retrieve survey from front end API
When manually triggered,
5/16/2022
3 days
Add automatic triggering of Lambda
When a button is pressed on Coda,
5/16/2022
2 days
Extend Lambda to convert coda API data to state machine JSON
When Lambda receives survey information from API gateway,
5/16/2022
3 days
Implement DynamoDB for JSON storage
When a DynamoDB instance is queried on a Survey ID,
5/16/2022
3 days
Create Lambda service to retrieve JSON from DynamoDB when triggered
When manually triggered,
3 days
Extend Lambda to return a state from JSON when executed with a current-state context
When this Lambda is executed with a hard-coded current state (for tests),
5 days
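The current-state lookup this issue describes might look like the following sketch. The machine layout (`startAt`, `states`, `next`) and the return shape are assumptions carried over from the state-machine JSON described under the Automation Backend epic:

```python
def next_state(machine, current_state):
    """Given the state-machine JSON and the caller's current state,
    return the state that follows. A None current_state starts the
    survey; a returned "end" signals completion."""
    if current_state is None:
        name = machine["startAt"]
    else:
        name = machine["states"][current_state]["next"]
    if name == "end":
        return {"state": "end", "prompt": None}
    return {"state": name, "prompt": machine["states"][name]["prompt"]}

# Hard-coded machine standing in for the JSON fetched from DynamoDB.
machine = {
    "startAt": "q0",
    "states": {
        "q0": {"prompt": "How satisfied are you?", "next": "q1"},
        "q1": {"prompt": "Any further comments?", "next": "end"},
    },
}
print(next_state(machine, "q0"))
```

In production the Lambda would receive `current_state` in the request context from the Landbot webhook and fetch `machine` from DynamoDB by survey ID.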
Attach API gateway to Lambdas
Attach API gateways to the Lambdas in order to communicate with Landbot.
2 days
Modify existing/create new Lambda service which receives data from Landbot and stores it in SQL database
Lambda should receive data from API call and link to a SQL database server to perform queries
2 days
Set up development environments for Lambdas
Create different staging environments for Lambda code to allow for integration/smoke tests at each layer. DevOps, and Cheslyn in particular, can assist with this
2 days
Implement Endpoint Testing on Lambdas
Lambdas should have their inputs and expected outputs mapped at a high level and the QA team can construct test cases for them, which should be used to validate their functionality
1 day
Implement monitoring of Lambda triggers
When the Lambda service is executed,
1 day
Implement monitoring in API code according to delta standards
When exceptions are thrown,
2 days
Add metrics to monitor in Landbot
Track the agreed metrics that monitor the health of Landbot. This can’t easily be done within Landbot itself, so third-party software or a separate flow could be used
3 days
Create a development environment in Landbot to test new bots
Set up a new bot flow that can be used for testing. Includes creating an appropriate development channel and number list to test bots securely, without impacting any users in production.
1 day
Create a generic bot flow with webhooks that query the Lambda service for survey questions
When the bot begins its flow,
8 days
Redesign the signup flow to streamline and debug the process
The current sign-up flow has too many bugs and too many branches. These branches should be streamlined to reduce redundancy, and questions/logic should be implemented to prevent bad data from being input
2 days
Extend bot flow to send data to Lambda service for DB storage via API call
Webhooks should be implemented in Landbot flow to send the stored variables to a Lambda service using API endpoints. The Lambda service is responsible for inserting/updating the SQL Database
1 day
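A sketch of the receiving Lambda, using an in-memory SQLite database as a stand-in for the SQL server. The webhook payload fields (`userId`, `questionId`, `answer`) and the table name are assumptions:

```python
import json
import sqlite3

# Module-level connection, mirroring Lambda container reuse between
# invocations. SQLite here stands in for the production SQL server.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE SurveyAnswers (user_id TEXT, question_id TEXT, answer TEXT)"
)

def lambda_handler(event, context=None):
    """Receive a Landbot webhook (forwarded by API Gateway) and store
    the answer in the database. Payload shape is an assumption."""
    body = json.loads(event["body"])
    conn.execute(
        "INSERT INTO SurveyAnswers (user_id, question_id, answer) "
        "VALUES (?, ?, ?)",
        (body["userId"], body["questionId"], body["answer"]),
    )
    conn.commit()
    return {"statusCode": 200, "body": json.dumps({"ok": True})}

resp = lambda_handler({"body": json.dumps(
    {"userId": "u1", "questionId": "q0", "answer": "Very satisfied"})})
print(resp)
```

The Landbot side of this is just a webhook block that POSTs the stored variables to the API Gateway endpoint fronting this handler.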
Identify current database issues
Our database is currently not being updated with the right data; we need to identify what is causing this
1 day
Investigate existing database schema
The database schema needs to be investigated to determine if it will meet the needs of the proposed business plan, and if changes need to be made to accommodate it.
1 day
Debug Database or Migrate to new schema
Depending on the nature of the bugs, migrating to a new schema might fix them.
3 days
Standardize survey question format
In order to accommodate the automation backend, the question format of surveys needs to be standardized. This will allow for flexibility in moving between front end software as long as the API calls to the back end service are made in the same format.
1 day
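A minimal validator for a standardized question format could look like the following. The required fields and allowed types are placeholder assumptions, to be replaced by whatever standard the team settles on:

```python
REQUIRED_FIELDS = {"id", "type", "text"}          # assumed standard fields
ALLOWED_TYPES = {"open_text", "multiple_choice", "rating"}  # assumed types

def validate_question(question):
    """Check one survey-question dict against the standard format.
    Returns a list of problems; an empty list means it is valid."""
    problems = []
    missing = REQUIRED_FIELDS - question.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    qtype = question.get("type")
    if qtype is not None and qtype not in ALLOWED_TYPES:
        problems.append(f"unknown type: {qtype}")
    if qtype == "multiple_choice" and not question.get("options"):
        problems.append("multiple_choice requires non-empty options")
    return problems

ok = validate_question(
    {"id": "q1", "type": "rating", "text": "How satisfied are you?"})
print(ok)
```

Running this check in the front end (or in the Lambda that converts the survey to state-machine JSON) keeps the API contract identical no matter which form builder is in front of it.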
Onboard onto Tableau
Data Studio is proving too slow to work with, so the first step in improving data visualisation speed is onboarding onto Tableau - a more extensive platform that should reduce the time it takes to create a report - before creating a standardized template
2 days
Investigate Google Cloud NLP
Investigate using Google Cloud NLP for sentiment analysis and extraction of entities from Open Text questions
2 days
Implement data cleaning pipeline
Survey data should be cleaned before being inserted into database
3 days
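As a sketch of the cleaning step, assuming free-text answers where whitespace noise and empty strings are the main problems; real validation and fraud checks would be added as further stages:

```python
import re

def clean_answer(raw):
    """Normalize one survey answer before insertion: trim whitespace,
    collapse internal runs of spaces, and map empty strings to None."""
    if raw is None:
        return None
    text = re.sub(r"\s+", " ", str(raw)).strip()
    return text or None

def clean_record(record, extra_cleaners=()):
    """Clean every field of a survey record, then apply any additional
    per-record cleaners (e.g. validation or fraud checks)."""
    cleaned = {k: clean_answer(v) for k, v in record.items()}
    for fn in extra_cleaners:
        cleaned = fn(cleaned)
    return cleaned

print(clean_record({"name": "  Jane   Doe \n", "comment": "   "}))
```

Keeping each stage a plain function makes the pipeline easy to run both on incoming records (before insertion) and as a batch pass over data already in the database.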
Ensure correct mapping of UserID to LandbotProfileID
There exist multiple entries in the Users table with the same LandbotProfileID. This field is meant to be unique, and is also assumed to be unique by other database operations, causing errors.
1 day
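The dedupe-then-constrain approach can be illustrated with SQLite as a stand-in for the production SQL server. Table and column names follow the epic description; keeping the lowest UserID per profile is an assumed policy:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Users (
    UserID INTEGER PRIMARY KEY,
    LandbotProfileID TEXT
);
-- two rows share lb-001, which other operations assume is unique
INSERT INTO Users VALUES (1, 'lb-001'), (2, 'lb-001'), (3, 'lb-002');
""")

# Keep the lowest UserID per LandbotProfileID and delete the duplicates
# (assumed policy; the real fix would first reconcile dependent rows).
conn.execute("""
DELETE FROM Users
WHERE UserID NOT IN (
    SELECT MIN(UserID) FROM Users GROUP BY LandbotProfileID
)
""")

# Enforce the one-to-one mapping going forward.
conn.execute("CREATE UNIQUE INDEX idx_users_lbid ON Users(LandbotProfileID)")
conn.commit()
print(conn.execute("SELECT UserID, LandbotProfileID FROM Users").fetchall())
```

The unique index is the important part: once it exists, any code path that tries to reintroduce a duplicate LandbotProfileID fails loudly instead of silently corrupting the mapping.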
Determine Validation Strategy
Determine how to test and monitor the DB to observe its correctness. How is the functionality determined to be correct once changes are made?
6 hrs
Update user demographic details from landbot tables
With the Landbot data migration, the latest Landbot records are stored in the database in separate tables, organized with rows corresponding to user IDs and one table per Landbot internal variable. As these number in the thousands, transforms will be needed to insert them into the database. The first step is to record the variables that are present as fields in the Users table, and update those. The other variables can be inserted as individual answers to surveys (if they don’t already exist)
1 day
Add Sheets data to SurveyResults
All survey result statuses are stored in Google Sheets for the respective surveys; this data needs to be imported into the SurveyResults table so that it is observable from within the database and Retool
6 hrs
Update all user numbers from landbot data
Because of a misuse of the Lambdas that inserted data in the incorrect order, a significant amount of user data appears as NULL in the database. As mobile numbers are needed to pay participants, a means of updating them from the data imported from Landbot needs to be implemented
5/30/2022
1 day
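The backfill can be expressed as a correlated UPDATE, shown here against SQLite as a stand-in for the production database. The LandbotImport table name and the sample numbers are assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Users (
    UserID INTEGER PRIMARY KEY,
    LandbotProfileID TEXT,
    MobileNumber TEXT
);
CREATE TABLE LandbotImport (          -- assumed name for the migrated data
    LandbotProfileID TEXT,
    MobileNumber TEXT
);
INSERT INTO Users VALUES (1, 'lb-001', NULL), (2, 'lb-002', '27820000000');
INSERT INTO LandbotImport VALUES ('lb-001', '27821111111');
""")

# Backfill only the NULL numbers, leaving existing values untouched.
conn.execute("""
UPDATE Users
SET MobileNumber = (
    SELECT li.MobileNumber FROM LandbotImport li
    WHERE li.LandbotProfileID = Users.LandbotProfileID
)
WHERE MobileNumber IS NULL
  AND LandbotProfileID IN (SELECT LandbotProfileID FROM LandbotImport)
""")
conn.commit()
print(conn.execute("SELECT UserID, MobileNumber FROM Users").fetchall())
```

Restricting the WHERE clause to NULL rows makes the update safe to re-run as more Landbot data is imported.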
Determine triggers to insert a record into SurveyResults
The SurveyResults table in the database already maps users to their paid and checked statuses for a particular survey, but no data is coming in. A method should be determined to fill this data with information as users complete surveys
1 day
Copy Transactions table and retool logic
The transactions table contains a schema that could be used to process payments, but its incorrect use means that a copy should be created (to avoid removing data) to start afresh
1 day
Set up triggers to insert records into transactions table
A record should be inserted into the transactions table to indicate the balance owed to a participant for completing and passing a survey based on their status in the SurveyResults table. Once a payment has been made, this balance should be negated
2 days
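The owed-balance and negation flow can be modelled with a database trigger, shown here in SQLite as a stand-in for the production server. The 50.0 reward amount, column names, and the pass-triggered insert are placeholder assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE SurveyResults (
    UserID INTEGER, SurveyID TEXT, Status TEXT  -- unchecked/passed/failed
);
CREATE TABLE Transactions (
    UserID INTEGER, SurveyID TEXT, Amount REAL
);
-- Insert an owed balance whenever a result is marked as passed.
CREATE TRIGGER reward_on_pass
AFTER UPDATE OF Status ON SurveyResults
WHEN NEW.Status = 'passed'
BEGIN
    INSERT INTO Transactions VALUES (NEW.UserID, NEW.SurveyID, 50.0);
END;
""")

conn.execute("INSERT INTO SurveyResults VALUES (1, 's1', 'unchecked')")
conn.execute("UPDATE SurveyResults SET Status='passed' WHERE UserID=1")

def record_payment(user_id, survey_id, amount):
    """Negate the owed balance once a payment has been made."""
    conn.execute("INSERT INTO Transactions VALUES (?, ?, ?)",
                 (user_id, survey_id, -amount))
    conn.commit()

record_payment(1, "s1", 50.0)
balance = conn.execute(
    "SELECT COALESCE(SUM(Amount), 0) FROM Transactions WHERE UserID=1"
).fetchone()[0]
print(balance)
```

Summing the Amount column gives the outstanding balance per participant: positive while owed, zero once paid, with the full history preserved as individual rows.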
Test transactions and balances
Feature testing before pushing to prod
1 day

Release Milestones

Survey Form to Landbot Flow Automation

This release will see the backend automation structure and functionality implemented in order to coordinate the generic Landbot flow (Landbot V2), which will then also have a front-end survey form builder in place to provide a client with a one-click creation of a survey that is automatically sent out to participants and have their responses recorded.

Data Pipelines and Cleaning

This release will revolve around treating and handling data across all parts of the overall system to ensure that it is appropriately cleaned and handled, reducing time and errors. While the survey automation release focuses on the back end and the process up until the survey has been delivered to participants, this release focuses on the processes after they have responded and how their data is handled.

Data Visualisation and UX

This milestone contains the reworking of our data visualisation process to reduce the time it takes to generate a report for a client, as well as implementing other front end systems to increase the quality of a client’s experience with the service.