I know there are other ways of accomplishing that, but this might be a convenient way of doing it. I’m wondering, though, whether Reddit is still reverting these changes.

  • Lvxferre@mander.xyz · 10 months ago

    I’m highlighting that having the data is not enough if you don’t have a good way to sort the trash out of it. Google will need to do that, not Reddit; Reddit is only handing the data over.
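
    To make it concrete, here’s a rough sketch (in Python) of the kind of sorting Google would be stuck doing. I’m assuming a JSONL dump where each comment has a `body` field; the field name and the marker phrases are made up for illustration, not Reddit’s actual schema:

    ```python
    import json

    # Phrases that mass-edit "scrubbing" tools tend to leave behind.
    # Illustrative list only; a real filter would need far more signals.
    SCRUB_MARKERS = (
        "this comment has been overwritten",
        "deleted in protest",
        "power delete suite",
    )

    def looks_scrubbed(body: str) -> bool:
        """Heuristic: does this comment read like protest boilerplate?"""
        text = body.lower()
        return any(marker in text for marker in SCRUB_MARKERS)

    def usable_comments(path: str):
        """Yield comments from a JSONL dump that still look fit for LLM training."""
        with open(path, encoding="utf-8") as f:
            for line in f:
                comment = json.loads(line)
                if not looks_scrubbed(comment.get("body", "")):
                    yield comment
    ```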

    Is this clear now? If you’re still struggling to understand it, refer to the context provided by the comment chain, including your own comments.

    • GBU_28@lemm.ee · 10 months ago

      I’m saying Reddit will not ship a trashed deliverable. Guaranteed.

      Reddit will have already preprocessed for this type of data damage. This is basic data engineering: finding events in the data and understanding the time series of those events is trivial.

      Google will be receiving uncorrupted data, because they’ll get data properly versioned to a point before the damaging event.

      If a high-edit event happens on March 7th, they’ll ship March 7th - 1d. Guaranteed.

      Edit, to be clear: you’re ignoring/not accepting the practice of flagging a high volume of edits per user as an event, and using that timestamped event as a signal of data validity.
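
      Roughly the kind of thing I mean, as a minimal pandas sketch. The column names (`user`, `edited_at`, `last_modified`) and the threshold are illustrative assumptions, not Reddit’s actual schema:

      ```python
      import pandas as pd

      def pre_event_cutoff(edits, threshold=50):
          """Return one day before the earliest day on which any single user
          crossed the edit-volume threshold, or None if nobody did.

          `edits` is assumed to have columns: user (str), edited_at (datetime).
          """
          daily = (
              edits.groupby(["user", edits["edited_at"].dt.floor("D")])
              .size()
              .rename("n_edits")
              .reset_index()
          )
          spikes = daily[daily["n_edits"] >= threshold]
          if spikes.empty:
              return None
          # If the high-edit event happens on March 7th, ship March 7th - 1d.
          return spikes["edited_at"].min() - pd.Timedelta(days=1)

      def snapshot_before_event(comments, edits):
          """Keep only comment versions from before the first damaging event."""
          cutoff = pre_event_cutoff(edits)
          if cutoff is None:
              return comments  # no high-edit event detected; ship everything
          return comments[comments["last_modified"] <= cutoff]
      ```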

      • Lvxferre@mander.xyz · 10 months ago

        > I’m saying Reddit will not ship a trashed deliverable. Guaranteed.

        Nobody said anything about the database being trashed. What I’m saying is that the database is expected to contain data unfit for LLM training, which Google will need to sort out, and Reddit won’t do it for Google.

        > Reddit will have already preprocessed for this type of data damage.

        Do you know it, or are you assuming it?

        If you know it, source it.

        If you’re assuming, stop wasting my time with shit that you make up and your “huuuuh?” babble.

        • GBU_28@lemm.ee · 10 months ago

          I know it because I’ve worked in corporate data engineering and large data migrations, and it would be abnormal to do anything else. There’s a full review of test data, a scope of work, an acceptance period, etc.

          You think Reddit doesn’t know about these utilities? You think Google doesn’t?

          You need to chill out and acknowledge how the industry works. I’m sure you’re convinced, but your idea of things isn’t how it actually operates.

          I don’t need to explain to you that the sky is blue. And I shouldn’t need to explain to you that Google isn’t going to accept a damaged product, or that Reddit can do some basic querying and time-series manipulation.

          Edit: it’s like you literally asked for a textbook.

          • Lvxferre@mander.xyz · 10 months ago

            > I know it because I’ve worked in corporate data migrations

            In other words: “I dun have sauce, I’m assooming, but chruuuust me lol”

            At this rate it’s safe to simply ignore your comments as noise. I’m not wasting further time with you.

            • GBU_28@lemm.ee · 10 months ago

              Seems like people are voting your comment as noise, but whatever.

              You’re trying to prove something normal ISN’T happening. I’m describing normal industry behavior.

              Seems like you need to prove an abnormal sitch is occurring.

              Edit: it’s like you’re asking for proof that they’ll build stairs with a handrail.