cross-posted from: https://beehaw.org/post/17683690

Archived version

Download study (pdf)

GitHub, the de facto platform for open-source software development, provides a set of social-media-like features to signal high-quality repositories. Among them, the star count is the most widely used popularity signal, but it is also at risk of being artificially inflated (i.e., faked), which decreases its value as a decision-making signal and poses a security risk to all GitHub users.

In a recent paper published on arXiv, the researchers present StarScout, a scalable tool that detects anomalous starring behaviors (i.e., low activity and lockstep) across the entire GitHub metadata, and use it for a systematic, global, and longitudinal measurement study of fake stars in GitHub.
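For readers who want a feel for the "low activity" signal, here is a minimal Python sketch. It is not the paper's StarScout implementation (which mines the full GitHub event metadata at scale); it simply pulls a repository's stargazers from the GitHub REST API and flags accounts with almost no recent public activity. The example repository, the number of pages fetched, and the two-event threshold are illustrative assumptions, not values from the study.

```python
# Minimal sketch (not StarScout): flag "low-activity" stargazers of a repo,
# i.e., accounts with almost no recent public GitHub events.
# Assumes a personal access token in the GITHUB_TOKEN environment variable.
# Note: checking many users this way will quickly hit API rate limits.
import os
import requests

API = "https://api.github.com"
HEADERS = {
    "Accept": "application/vnd.github+json",
    "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
}

def stargazers(owner: str, repo: str, pages: int = 3) -> list[str]:
    """Return up to `pages` pages (100 per page) of stargazer logins."""
    logins = []
    for page in range(1, pages + 1):
        resp = requests.get(
            f"{API}/repos/{owner}/{repo}/stargazers",
            headers=HEADERS,
            params={"per_page": 100, "page": page},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        logins.extend(user["login"] for user in batch)
    return logins

def looks_low_activity(login: str, max_events: int = 2) -> bool:
    """Heuristic: an account with <= max_events recent public events."""
    resp = requests.get(
        f"{API}/users/{login}/events/public",
        headers=HEADERS,
        params={"per_page": max_events + 1},
        timeout=30,
    )
    resp.raise_for_status()
    return len(resp.json()) <= max_events

if __name__ == "__main__":
    # Example target repo is arbitrary; the threshold of 2 events is illustrative.
    suspicious = [u for u in stargazers("octocat", "Hello-World") if looks_low_activity(u)]
    print(f"{len(suspicious)} stargazers look low-activity: {suspicious[:10]}")
```

The paper's second signal, lockstep behavior (roughly, groups of accounts starring the same repositories within the same narrow time windows), needs the archived event stream rather than per-user API calls, so it is not sketched here.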

Analyzing the data collected using StarScout, they find that:

(1) fake-star-related activities have rapidly surged since 2024

(2) the profile characteristics of fake stargazers are not distinct from those of average GitHub users, but many of them show highly abnormal activity patterns

(3) the majority of fake stars are used to promote short-lived malware repositories masquerading as pirating software, game cheats, or cryptocurrency bots

(4) some repositories may have acquired fake stars for growth hacking, but fake stars only have a promotional effect in the short term (i.e., less than two months) and become a burden in the long term.

The study has implications for platform moderators, open-source practitioners, and supply chain security researchers.

  • ITeeTechMonkey@lemmy.world · 1 day ago

    Does Codeberg have anything that will prevent an influx of the bots and AI accounts that have plagued GitHub?

    I ask because, as Codeberg's user base grows, the bots, AI accounts, and nefarious actors will follow.

    I like the idea of a federated source-code hosting platform, especially since it removes lock-in to a single corporation and a de facto monopoly.

    That in itself is a good enough reason to migrate, but for this particular issue (bots/AI and artificial project promotion with malicious intent), it feels like rearranging deck chairs on the Titanic.

    • mox@lemmy.sdf.org (OP) · 1 day ago

      Once these things are federated, it seems reasonable to expect that each instance would be able to choose which stars/followers/etc. it accepts or displays, roughly similar to what Lemmy does with allowed/blocked instances. That might put a dent in the problem. At least, there would no longer be a single, easy, high-value target for this sort of thing.