roadmap

These are the currently drafted features planned for after the MVP is complete. They are not necessarily in order; we should definitely come back to this, re-order it, and fill it in with new understandings.

premium features flow

how can we make it elegant to manage and disclose possible features?

  1. this is the foundation for most of the following features, hopefully leaning on the base Apple ecosystem (StoreKit) as much as possible.
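
A minimal sketch of what the entitlement check could look like on StoreKit 2; the product identifiers and type names here are assumptions, not final decisions:

```swift
import StoreKit

// Hypothetical premium feature identifiers; the real IDs are TBD.
enum PremiumFeature: String, CaseIterable {
    case storyboards = "app.premium.storyboards"
    case recap = "app.premium.recap"
}

// Asks StoreKit 2 which premium features the person currently owns.
func ownedPremiumFeatures() async -> Set<PremiumFeature> {
    var owned: Set<PremiumFeature> = []
    for await result in Transaction.currentEntitlements {
        if case .verified(let transaction) = result,
           let feature = PremiumFeature(rawValue: transaction.productID) {
            owned.insert(feature)
        }
    }
    return owned
}
```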

Expanded scene blocks

  1. New types:
    1. Dialog
      1. Voiceover by default
    2. Parenthetical
      1. Thoughts by default
  2. Tags for each scene block
    1. Example: “Starbucks - Morning (9am) #me #work” where “me” and “work” signify buckets of life to help understand where my focus/demand is at that time.
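
A rough sketch of the expanded block model plus the tag parsing from the example above; all names are assumptions:

```swift
import Foundation

// Sketch of the expanded scene block model.
enum SceneBlockType {
    case action
    case dialog        // voiceover by default
    case parenthetical // thoughts by default
}

struct SceneBlock {
    var type: SceneBlockType
    var text: String

    /// Pulls "#me #work"-style bucket-of-life tags out of the block's text.
    var tags: [String] {
        text.split(separator: " ")
            .filter { $0.hasPrefix("#") }
            .map { String($0.dropFirst()) }
    }
}

let block = SceneBlock(type: .action, text: "Starbucks - Morning (9am) #me #work")
print(block.tags) // ["me", "work"]
```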

storyboards

Just like threaded replies in messaging, storyboards are a way to thread together cues that relate to a top-level scene block.

  1. When swiping on a scene block, a user can now see a detail view with that scene block’s related cues
    1. Example: A scene block with a storyboard can attach additional cues like photos, videos, or other journal entries.
  2. When in the Storyboard view, a user can see a historical timeline of changes they’ve made to the scene block
    1. Example: A user can see when they’ve added a photo, video, or any edits to the scene block
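
A sketch of how the storyboard thread and its edit history could hang together; the types are assumptions, not the real model:

```swift
import Foundation

// A storyboard threads cues off a top-level scene block,
// and keeps a timeline of the changes made to it.
enum Cue {
    case photo(URL)
    case video(URL)
    case entry(id: UUID) // another journal entry
}

struct StoryboardEvent {
    let date: Date
    let description: String // e.g. "added a photo"
}

struct Storyboard {
    let sceneBlockID: UUID // the scene block this threads from
    var cues: [Cue] = []
    var history: [StoryboardEvent] = []

    mutating func attach(_ cue: Cue, describedAs description: String) {
        cues.append(cue)
        history.append(StoryboardEvent(date: .now, description: description))
    }
}
```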

highlights

  1. Casts are people in your life
    1. Example: Proper nouns can all be auto-tagged, especially after initial tagging (see the sketch after this list).
  2. Props are items that support a scene
    1. Example: “iPhone”, “Phone”, “My Car” can all be auto tagged.
  3. Locations are tied to scenes
    1. Example: Day One’s auto tracking of interesting locations.
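
Apple’s NaturalLanguage framework can already do the proper-noun pass; a minimal sketch (person/place/organization names map roughly to casts and locations; props like “iPhone” would still need a custom list):

```swift
import NaturalLanguage

let text = "Met Rachel at Starbucks and mentioned my iPhone."
let tagger = NLTagger(tagSchemes: [.nameType])
tagger.string = text

// Walk the words and surface people, places, and organizations.
tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                     unit: .word,
                     scheme: .nameType,
                     options: [.omitPunctuation, .omitWhitespace, .joinNames]) { tag, range in
    if let tag = tag, [.personalName, .placeName, .organizationName].contains(tag) {
        print("\(text[range]) -> \(tag.rawValue)")
    }
    return true
}
```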

smarter cues

  1. Placeholder: Learn the number of scene blocks a person typically has
    1. Example: If a person typically has 5 scene blocks in a scene, then the default number of cues should match that.
  2. Calendar pulls in events to understand the main focus for a time block (see the EventKit sketch after this list)
    1. Current assumption is these graduate from smart cue data to slug lines, which are definitely tied to a time.
  3. Location pulls in places you’ve been to understand transitions that have occurred
  4. Photos/Videos pulls in media to understand the visual context of a time block
  5. Health activity: not just workouts, but also potentially clusters of steps/heart rate
  6. Sleep activity, to understand that overnight there’s no need for multiple plots, just a great thread of dreams
  7. Weather, to understand the context of the day/night and potentially severe weather
  8. Social media accounts can bring in your activity to understand what you chose to share with the world
  9. Reminders as context for what was on the agenda and completed for the day. Bonus if tagged to a bucket of life.
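
For the calendar cue specifically, a sketch against EventKit (this uses the pre-iOS-17 authorization call; the function name is ours):

```swift
import EventKit

let store = EKEventStore()

// Pull the day's calendar events as candidates for smart cues / slug lines.
func fetchCueCandidates(for day: Date, completion: @escaping ([EKEvent]) -> Void) {
    store.requestAccess(to: .event) { granted, _ in
        guard granted else { return completion([]) }
        let start = Calendar.current.startOfDay(for: day)
        let end = Calendar.current.date(byAdding: .day, value: 1, to: start)!
        let predicate = store.predicateForEvents(withStart: start, end: end, calendars: nil)
        completion(store.events(matching: predicate))
    }
}
```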

notifications

  1. NOT DAILY CHECK-INS. Instead: “You just finished a workout, write down a few words about how it went”, or “Hey, you’re cranking out the steps, wanna jot down what’s going on?”
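
One way the workout nudge could work, assuming HealthKit and notification permissions are already granted; a sketch pairing an HKObserverQuery with a local notification:

```swift
import HealthKit
import UserNotifications

let healthStore = HKHealthStore()

// Fire a gentle nudge whenever a new workout sample lands in HealthKit.
func observeWorkouts() {
    let query = HKObserverQuery(sampleType: HKObjectType.workoutType(),
                                predicate: nil) { _, completionHandler, error in
        defer { completionHandler() } // always tell HealthKit we're done
        guard error == nil else { return }
        let content = UNMutableNotificationContent()
        content.body = "You just finished a workout, write down a few words about how it went"
        let request = UNNotificationRequest(identifier: UUID().uuidString,
                                            content: content,
                                            trigger: nil) // nil trigger = deliver now
        UNUserNotificationCenter.current().add(request)
    }
    healthStore.execute(query)
}
```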

recap

  1. A way to see a summary of the day/week/month/year/all, quantifying everything that can be quantified (see the sketch after this list)
    1. Example: “You’ve written 10,000 words this month, 5,000 of which were about work, 3,000 about family, and 2,000 about personal”
    2. Example Week/Month/Year/All: “In general you’ve felt more positive about work than last month, but less positive about family”
  2. Correlations of storyboards and which you tend to update more often
    1. Example: “You tend to update your work storyboards more often than your personal ones, but you tend to write more about personal than work”
  3. Which scenes you come back to in the future
    1. Example: “You tend to come back to your scene blocks from 2 years ago more often than your evening ones”
    2. Example: “You tend to look at highlights across people more than things”
  4. Quantifying the categories/buckets of life and how much time you’ve spent on each
    1. Example: “You’ve spent 50% of your time on work, 30% on personal, and 20% on family”
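
The math behind item 1 is just word counts rolled up per bucket-of-life tag; a sketch with assumed type names:

```swift
import Foundation

struct Entry {
    let text: String
    let buckets: [String] // e.g. ["work"], ["family"]
}

// Roll word counts up per bucket so recap can report
// "5,000 of which were about work", etc.
func wordCounts(for entries: [Entry]) -> [String: Int] {
    var totals: [String: Int] = [:]
    for entry in entries {
        let words = entry.text.split(whereSeparator: \.isWhitespace).count
        for bucket in entry.buckets {
            totals[bucket, default: 0] += words
        }
    }
    return totals
}

let totals = wordCounts(for: [
    Entry(text: "Long day shipping the release", buckets: ["work"]),
    Entry(text: "Dinner with everyone tonight", buckets: ["family"]),
])
print(totals) // ["work": 5, "family": 4]
```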

resolutions

Chose this name due to the script writing theme

  1. A way to not only set resolutions, but also see how the scene blocks and storyboards are helping you reach those goals (see the sketch below)
    1. Example: If there are too many scene blocks, slowly start giving feedback that there’s too much going on, based on what the user has set
    2. Example: If there’s not enough focus on a specific bucket of life, slowly start giving feedback that “me” or “work” needs more attention
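
A sketch of the feedback rule, assuming the user sets per-bucket thresholds (all names hypothetical):

```swift
// A resolution as the user might configure it; names are hypothetical.
struct Resolution {
    let bucket: String        // e.g. "me", "work"
    let minBlocksPerWeek: Int // below this: not enough focus
    let maxBlocksPerWeek: Int // above this: too much going on
}

// Returns a gentle nudge, or nil when the person is on track.
func feedback(for resolution: Resolution, blocksThisWeek: Int) -> String? {
    if blocksThisWeek > resolution.maxBlocksPerWeek {
        return "There’s a lot going on in \(resolution.bucket) this week."
    }
    if blocksThisWeek < resolution.minBlocksPerWeek {
        return "Not much focus on \(resolution.bucket) lately."
    }
    return nil
}
```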

export

TBD

whimsical ideas…

  1. On load: Animated rendering of Slug Lines to fill in the in-between, like a script being written (see the SwiftUI sketch after this list)
    1. Example: Typewriter-esque animation of the Slug Lines being written in, then the action, then the transitions
  2. If the launch screen looks like a ton of pages of script, and the first view is the same, we zoom in to one of the pages and start there
    1. Could be a good mix of above
  3. Changing the type of scene block should have a fun animation (also Apple Pencil Pro when tilting)
    1. Example: Swiping right on a scene block changes (as you drag) between the types of scene blocks, similar to Paper by 53’s undo or tool selection
  4. Rachel’s awesome chart thing, using accelerometer to make it bounce/move as you move the device, subtle, but there!
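
A quick sketch of the typewriter effect from item 1 in SwiftUI; timing and styling are placeholders:

```swift
import SwiftUI

// Reveals a slug line one character at a time, typewriter style.
struct TypewriterText: View {
    let fullText: String
    @State private var visibleCount = 0

    var body: some View {
        Text(String(fullText.prefix(visibleCount)))
            .font(.system(.body, design: .monospaced))
            .onAppear {
                Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { timer in
                    guard visibleCount < fullText.count else { return timer.invalidate() }
                    visibleCount += 1
                }
            }
    }
}

// Usage: TypewriterText(fullText: "INT. STARBUCKS - MORNING")
```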

notes

Currently thinking this feature can happen last… since you need data in here to even go back and have any sort of continuity

emotional state

Rachel has an amazing vision for this, as seen in Figma

  1. currently thinking these will be stored as notes with a correlating note type

deep linking and routing

would be amazing to sync back to calendar events and deep link to the scene block each event relates to
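
A sketch of the receiving side with SwiftUI’s onOpenURL; the journal:// scheme and URL shape are assumptions (a calendar event’s url field could carry the link):

```swift
import SwiftUI

@main
struct JournalApp: App {
    @State private var openedSceneBlockID: UUID?

    var body: some Scene {
        WindowGroup {
            Text(openedSceneBlockID.map { "Scene block \($0)" } ?? "No deep link yet")
                .onOpenURL { url in
                    // e.g. journal://scene-block/<uuid>, stored on the calendar event
                    guard url.scheme == "journal",
                          url.host == "scene-block",
                          let id = UUID(uuidString: url.lastPathComponent) else { return }
                    openedSceneBlockID = id // hand off to real routing later
                }
        }
    }
}
```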

promo codes to unlock features

a great marketing thing. “first 100 links here will unlock XXXX feature(s) for free for life”

Observables

A person can see exactly what’s reported back to us, and choose data controls

  1. Given we choose to log only aggregates (no specific values of what a person inputs), we show back to them what was sent (see the sketch below).
    1. Example: Something like this: https://posthog.com/docs/libraries/ios
  2. If a person is trying to submit a bug report (shake?), then we prompt them to turn this feature on, even temporarily
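
A sketch of the aggregate-only rule against the PostHog iOS SDK from the linked docs; the event name, property, and buckets are assumptions, and it assumes PostHogSDK.shared.setup(...) ran at launch:

```swift
import PostHog

// Log that an entry was saved, but only an aggregate:
// a bucketed word count, never the entry text itself.
func logEntrySaved(wordCount: Int) {
    let bucket: String
    switch wordCount {
    case ..<100: bucket = "<100"
    case ..<500: bucket = "100-499"
    default:     bucket = "500+"
    }
    PostHogSDK.shared.capture("entry_saved", properties: ["word_count_bucket": bucket])
}
```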