ChatGPT conversations still lack timestamps after years of requests

(community.openai.com)

107 points | by Valid3840 3 hours ago

22 comments

  • zbycz 29 minutes ago
    If you download a Data export, the timestamps are there for every conversation, and often for messages as well.

    The HTML file is just a big JSON blob with some JS rendering, so I wrote this one-line sed command, which adds the timestamp before each conversation title:

      sed -i 's|"<h4>" + conversation.title + "</h4>"|"<h4>" + new Date(conversation.create_time*1000).toISOString().slice(0, 10) + " @ " + conversation.title + "</h4>"|' chat.html
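
    For anyone who'd rather work from the export directly than patch the HTML, here is a rough sketch that lists conversation timestamps from the export's conversations.json (the exact layout of that file is assumed here and may differ between export versions):

      # print_timestamps.py - list conversation timestamps from a ChatGPT data
      # export. Assumes the export contains a conversations.json whose entries
      # carry "title" and "create_time"; the layout may vary between versions.
      import json
      from datetime import datetime, timezone

      with open("conversations.json", encoding="utf-8") as f:
          conversations = json.load(f)

      for conv in conversations:
          created = conv.get("create_time")
          stamp = (datetime.fromtimestamp(created, tz=timezone.utc).date().isoformat()
                   if created else "unknown")
          print(f"{stamp}  {conv.get('title') or '(untitled)'}")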
  • Valid3840 3 hours ago
    ChatGPT still does not display per-message timestamps (time of day / date) in conversations.

    This has been requested consistently since early 2023 on the OpenAI community forum, with hundreds of comments, upvotes, and deleted threads, yet it remains unimplemented.

    Can any of you think of a reason (UX-wise) for it not to be displayed?

    • Qem 1 hour ago
      > Can any of you think of a reason (UX-wise) for it not to be displayed?

      I can imagine a legal one. If the LLM messes up big time[1], timestamps could help build the case against it and make investigative work easier.

      [1] https://www.ap.org/news-highlights/spotlights/2025/new-study...

      • azinman2 27 minutes ago
        It’s already in the data export.
    • Workaccount2 3 hours ago
      Regular people hate numbers.

      Not a joke. To capture a wide audience you want to avoid numbers, among other technical niceties.

      • dymk 20 minutes ago
        This makes sense only if you don’t think about it at all.
      • johnfn 38 minutes ago
        Do regular people not use any mainstream messaging app - Messenger, iMessage, etc?
      • smelendez 2 hours ago
        Make it a toggle then, like a lot of popular chat apps?
        • Y_Y 2 hours ago
          There's only one thing they hate more than numbers...
      • vasco 2 hours ago
        > Regular people hate numbers

        What does this even mean

      • bobse 2 hours ago
        Should they be allowed anywhere near a computer?
        • falcor84 2 hours ago
          I actually agree there's an issue here. I feel we've dumbed down interfaces so much that people who in previous generations would barely have written, and wouldn't have affected anyone outside their close friends and family, now have their voices algorithmically amplified to millions. And given that the algorithms care only about engagement rather than eloquence (let alone veracity), these people end up believing their thoughts are just as valid regardless of substance, and that there's nothing to gain by learning numeracy.

          EDIT: It's not a new issue, and Asimov phrased it well back in 1980, but I feel it has gotten much worse.

          > Anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge'.

    • intrasight 3 hours ago
      Sounds like an easy browser extension
    • eth0up 22 minutes ago
      My honest opinion, which may be entirely wrong but remains my impression, is:

      User Engagement Maximization At Any Cost

      Obviously there's a point at which a session becomes too long, but I suspect there's a sweet spot somewhere that the optimization targets.

      I often observe, rightly or not, that among the indicators I suspect of being engagement augmentation is a tendency for vital information to be withheld, while longer, more complex procedures get priority over simpler, cleaner solutions.

      Of course, all sorts of emergent behaviors could convey such impressions falsely. But I do believe an awful lot of psychology and clever manipulation have been built into the system as tools.

      I have a lot of evidence for this and much more, but I realize it may merely be coincidence. That said, many truly fascinating, fully identifiable patterns from pathological psychology can be seen: DARVO, gaslighting, and basically everything one would see with a psychotic interlocutor.

      Edit: Much of the above was observed after putting the system under scrutiny. On one super astonishing and memorable occasion, GPT recommended I call a suicide hotline because I questioned its veracity and logic.

  • firesteelrain 2 hours ago
    Just a note to those adding the time to the personalization response: it's inaccurate. If you have an existing chat, the time is near the last time that chat session was active. If you open a new one, it can be off by ±15 minutes for some reason.
    • baby 2 hours ago
      I was using a continuous conversation with chatgpt to keep track of my lifts, and then I realized it never understands what day I'm talking to it. There's no consistency; it might as well assume it's the date of the first message you sent.
      • brap 2 hours ago
        I think that’s exactly why they’re not including timestamps. If timestamps are shown in the UI users might expect some form of “time awareness” which it doesn’t quite have. Yes you can add it to the context but I imagine that might degrade other metrics.

        Another possible reason is that they want to discourage users from using the product in a certain way (one big conversation) because that’s bad for context management.

      • malfist 2 hours ago
        What purpose does logging your lifting with chatgpt achieve?
        • cj 1 hour ago
          It’s an incredible tool for weightlifting. I use it all the time to analyze my workout logs that I copy/paste from Apple Notes.

          Example prompts:

          - “Modify my Push #2 routine to avoid aggravating my rotator cuff”

          - “Summarize my progression over the past 2 months. What lifts are progressing and which are lagging? Suggest how to optimize training”

          - “Are my legs hamstring or glute dominant? How should I adjust training”

          - “Critique my training program and suggest optimizations”

          That said, I would never log directly in ChatGPT since chats still feel ephemeral. Always log outside of ChatGPT and copy/paste the logs when needed for context.

          • throwup238 17 minutes ago
            You can also export to CSV and use that file in the chat if you’re using a tracking app like Hevy.
          • dnpls 1 hour ago
            That's brilliant. I've had an injury for a while now, and I change my routine on the fly at the gym, depending on whether I still feel pain or not. It's much better if I change it before the next time I go, so I don't waste time figuring out what to replace.
            • cmgbhm 1 hour ago
              I did this planning with Gemini and track it in Google Sheets (which really stinks on mobile):

              - Cardio goals, current FTP, days to train, injuries to avoid

              - 3-lift-day programs with tracking, 8-week progressive, looping my PT into warm-ups

              - Alternate suggestions

              - Use the whole sheet to get an overview of how the last 8 weeks went, then change things up

        • serf 1 hour ago
          Presumably the same thing that logging anything with an LLM achieves: turning plain language into structured text quickly.
  • stuckkeys 3 minutes ago
    Time stamps? lol They still don’t have the option to search your previous history. Luckily I built an extension that stores all chats locally in a database so I can reference and view them offline if I want to. Time stamps included.
  • bravetraveler 2 hours ago
    Surely an intern over there can prompt a toggle/hover event
  • journal 1 hour ago
    You only need that info if you know you need it in your RAG. Over the last two years of usage I don't recall where I'd have needed those timestamps, but I know there are cases. Still, this would have to be an option, because otherwise it would be a waste of tokens. However, we have to consider that they are competing on the quality AND length of the response, even when a shorter response is better. There's a pretzel of considerations when talking about this.
    • gukoff 1 hour ago
      Timestamps are conversation metadata; they don't need to be fed to the LLM or consume tokens.
    • cj 1 hour ago
      Imagine you started having back pain months ago and you remember asking ChatGPT questions when it first started.

      Now you’re going to the doctor and you forgot exactly when the pain started. You remember that you asked ChatGPT about the pain the day it started.

      So you look for the chat, and discover there are no dates. It feels like such an obvious thing that’s missing.

      Let’s not overcomplicate things. There aren’t that many considerations. It’s just a date. It doesn’t need to be stuffed into the context of the chat. Not sure why the quality or length of the chat would need to be affected.

  • phyzix5761 1 hour ago
    Is it possible they're reusing responses that are close enough by some measure? Maybe that's why exposing a timestamp wouldn't be beneficial for them.
  • callamdelaney 1 hour ago
    They also don’t support code formatting of inputs. You’d think after 3 years or whatever they’d have resolved that.
  • abadar 2 hours ago
    I built a single-page website that copies the current time to my clipboard, and I paste it into my messages. It's inconvenient, so I don't do it regularly.

    I'll have to look into the extension described in the link. Thank you for sharing. It's nice to know it's a shared problem.

  • throw03172019 1 hour ago
    New startup idea: ChatGPT but with timestamps. $100M series A
  • tomComb 2 hours ago
    You can see a chat timestamp when it shows up as a search result.

    I’m not suggesting this is sufficient; I’m just noting that there is somewhere in the user interface where it is displayed.

    • firesteelrain 2 hours ago
      There is something wrong with the embedded/hidden time. It doesn't show as accurate at all for me. Maybe they are using it for some other reason.
  • submeta 17 minutes ago
    Beyond the lack of timestamps, ChatGPT produces oddly formatted text when you copy answers. It’s neither proper markdown nor rich text. The formatting is consistently off: excessive newlines between paragraphs, strangely indented lists, and no markdown support whatsoever.

    I regularly use multiple LLM services including Claude, ChatGPT, and Gemini, among others. ChatGPT’s output has the most unusual formatting of them all. I’ve resorted to passing answers through another LLM just to get proper formatting.

  • mv4 2 hours ago
    Other than the potential liability, cost may also be a factor.

    Back in April 2025, Altman mentioned people saying "thank you" was adding “tens of millions of dollars” to their infra costs. Wondering if adding per-message timestamps would cost even more.

    • mikkupikku 8 minutes ago
      Altman was being dumb; being polite to LLMs makes them produce higher-quality results, which means less back-and-forth and saves money in the long run.
    • cj 2 hours ago
      Presumably you could decouple timestamps from inference.

      I would be very surprised if they don’t already store date/time metadata. If they do, it’s just a matter of exposing it.
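
      For what it's worth, the API layer already returns a created Unix timestamp with every completion, so a client can keep per-message times as metadata without ever putting them in the prompt. A minimal sketch using the openai Python client (illustrative only; the surrounding storage and model name are assumptions):

        # Illustrative sketch: keep a per-message timestamp as metadata next to
        # the reply, without ever adding it to the context the model sees.
        from datetime import datetime, timezone
        from openai import OpenAI

        client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

        history = [{"role": "user", "content": "Log: squat 5x5 @ 100 kg"}]
        resp = client.chat.completions.create(model="gpt-4o-mini", messages=history)

        record = {
            "content": resp.choices[0].message.content,
            # resp.created is the completion's server-side Unix timestamp
            "created_at": datetime.fromtimestamp(resp.created, tz=timezone.utc).isoformat(),
        }
        history.append({"role": "assistant", "content": record["content"]})  # no timestamp in context
        print(record["created_at"], record["content"])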

    • g947o 1 hour ago
      I think "thank you" are used for inference in follow-up messages, but not necessarily timestamps.

      I just asked ChatGPT this:

      > Suppose ChatGPT does not currently store the timestamp of each message in conversations internally at all. Based on public numbers/estimates, calculate how much money it will cost OpenAI per year to display the timestamp information in every message, considering storage/bandwidth etc

      The answer it gave was $40K-$50K. I am too dumb and inexperienced to go through everything and verify if it makes sense, but anyone who knows better is welcome to fact check this.

    • nacozarina 1 hour ago
      it’s wild ppl accept his rhetoric at face value
    • stainablesteel 1 hour ago
      This is actually hilarious, and also easily fixable if they just respond to that with a pre-determined reply:

      if response == "thank you": print("you're welcome")

  • kingforaday 2 hours ago
    Just like on a piece of hardware that doesn't have an RTC, we rely on NTP. Maybe we just need an NTP MCP server for the agents. Looks like there are several open-source projects already, but I'm not linking to them because I don't know their quality or trustworthiness.
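
    As a sketch of how small such a thing could be, here's roughly what a current-time tool looks like with the MCP Python SDK's FastMCP helper (the SDK details are assumed, and this isn't an endorsement of any particular project):

      # Minimal sketch of a "current time" MCP server, assuming the official
      # MCP Python SDK's FastMCP helper (pip install mcp). Agents that speak
      # MCP could call this tool instead of relying on timestamps in the UI.
      from datetime import datetime, timezone

      from mcp.server.fastmcp import FastMCP

      mcp = FastMCP("clock")

      @mcp.tool()
      def current_time() -> str:
          """Return the current UTC time as an ISO 8601 string."""
          return datetime.now(timezone.utc).isoformat()

      if __name__ == "__main__":
          mcp.run()  # serves over stdio by default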
  • chasing0entropy 3 hours ago
    It's ugly. The fact that it isn't at least exposed as an option for power users to enable makes me wonder whether timestamps would give some advantage to an inference scraper, or possibly their service APIs don't have contemporaneous access to the metadata available from the web interface.
  • stainablesteel 1 hour ago
    May as well add a model stamp too, to remember which model was responding.
  • tom1337 2 hours ago
    What annoys me even more is that ChatGPT doesn't alert you when you're nearing the context window limit. I have a chat which I've worked on for a year and have now hit the limit. I worked around this by doing a GDPR download of all messages, reconstructing the conversation in a markdown file, and then giving that file to Claude to create a summarized / compacted version of the chat...
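
    If anyone wants to do the same, here's a rough sketch of the reconstruction step (it assumes the export's conversations.json keeps messages in a "mapping" of nodes with author.role, create_time, and content.parts, which may differ between export versions):

      # Rough sketch: rebuild one exported conversation as markdown so it can
      # be summarized elsewhere. Assumes conversations.json entries carry a
      # "mapping" of nodes whose "message" objects have author.role,
      # create_time and content.parts; the export format may differ.
      import json
      from datetime import datetime, timezone

      with open("conversations.json", encoding="utf-8") as f:
          conv = json.load(f)[0]  # first conversation; select yours by title instead

      messages = []
      for node in conv.get("mapping", {}).values():
          msg = node.get("message")
          if not msg or not msg.get("create_time"):
              continue
          parts = msg.get("content", {}).get("parts", [])
          text = "\n".join(p for p in parts if isinstance(p, str)).strip()
          if text:
              messages.append((msg["create_time"], msg["author"]["role"], text))

      with open("conversation.md", "w", encoding="utf-8") as out:
          for ts, role, text in sorted(messages):
              stamp = datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d %H:%M")
              out.write(f"**{role}** ({stamp}):\n\n{text}\n\n")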
  • roschdal 2 hours ago
    I have had enough of this Evil AI. Never again.
  • bobse 3 hours ago
    [dead]
  • PunchTornado 2 hours ago
    Surprised that people still use chatgpt
    • serf 1 hour ago
      having been a customer of Anthropic and Google at varying times, it's not surprising to me in the least.

      As the companies sprint towards AGI as the goal, the floor for acceptable customer service has never been lower. These two concepts are not unrelated.

      • bobse 5 minutes ago
        [dead]
    • baby 2 hours ago
      Personally I use all of them all the time and chatgpt is still on top
      • andai 2 hours ago
        Could you elaborate on your experience with the different ones? What do you use them for, and how do they compare? Thanks
    • logicallee 2 hours ago
      what do you use?
      • andai 2 hours ago
        For conversational use, which is the main way these things are used, I personally found Claude to be the best.

        Claude Sonnet is my favorite, despite occasionally going into absurd levels of enthusiasm.

        Opus is... Very moody and ambiguous. Maybe that helps with complex or creative tasks. For conversational use I have found it to be a bit of a downer.

      • fatata123 2 hours ago
        [dead]