cross-posted from: https://beehaw.org/post/12170575

The GDPR has rules requiring data controllers to be fair and transparent, and EDPB guidelines clarify in detail what fairness and transparency entail. As far as I can tell, what I am reading strongly implies a need for source code to be released in situations where an application is directly executed by a data subject and the application also processes personal data.

I might expand on this more, but for now I’m looking for information about whether this legal theory has been analyzed or tested. If anyone knows of related court opinions or rulings, or even some NGO’s analysis on this topic, I would greatly appreciate a reference.

#askFedi

  • debanqued@beehaw.org (OP) · edited, 9 months ago

    Glad to get a response. I was starving for feedback.

    I probed into a state-endorsed, closed-source Android app that collects sensitive personal info (including a touch of Art. 9 data) and contains undisclosed trackers. Not only is it closed-source, but the license prohibits reverse engineering, thus proactively blocking data subjects from understanding how their data is processed and consequently breaking a lot of transparency and fairness guidelines. I will expose this in greater detail eventually.

    AFAICT, the GDPR itself just vaguely says: be transparent and be fair. The EDPB has published a couple of lengthy guidelines covering what that means, and closed-source software seems to quite starkly violate many of them. EDPB guidelines are not legally binding, but GDPR Art. 5(2) places the burden on data controllers to prove fair and transparent processing:

    The controller shall be responsible for, and be able to demonstrate compliance with, paragraph 1 (‘accountability’).

    The EU and UK will change the law long before they ban closed source software.

    Perhaps, but the discussion should happen because I’m sure they need to draw lines. IMO it’s inevitable. It would likely be argued that “software code is far from being a simple plain understandable language to most people”, but the sensible compromise I can imagine being reasonably forced is: publish the code, or publish a detailed statement of everything the app does with personal data. Even if the latter turns out to dominate, that’s still a big stride from today’s reality. As it stands right now, transparency and fairness are a complete joke as soon as user-executed closed-source apps come into play. It’s a glaring loophole.

    It is just not economically viable for them to interact with the rest of the world.

    Worth noting that the GDPR does not imply the need for all four software freedoms… just the code-inspection freedom. I’m not convinced that would do any notable harm to the economy.

    I also cannot imagine a ban on reverse-engineering prohibition clauses having a noteworthy negative impact on the economy. It would be an injustice not to have a rule or guideline that effectively says “data subjects have a right to reverse engineer apps that process their data”.

    • HumanPenguin · 9 months ago

      I would agree that failing to be open about how data is used is a breach. But that is also the argument that will be used to claim code doesn’t need to be open source.

      Let’s look at it through a legal analogy rather than a technical one.

      Growing drugs in your house is illegal. But the law still protects your right to privacy in your home. Police cannot search your home without having evidence that you are growing drugs, and in most cases they need to prove that to a judge. (Yeah, we will ignore how far away from “most” the police have gotten.)

      The right of a software company to protect its code is, and likely always will be, treated as important.

      Hell, your right to protect your code and choose to make it open or closed source is the very ideal the GPL and other open-source licences depend on to force others using your code to treat it as you demand. So a law removing that choice from you may well do more harm than good.

      The problem is, like many when it comes to law: making code transparent is not designed to ensure the transparency of data use. It is designed to make proving that people are breaking the law easier.

      Unfortunately, I feel protecting the rights of developers or householders is far more important than reducing the cost to society of proving someone is stealing data or growing pot.

      Society has a duty to invest in its enforcement of the law. And when someone does as you have done and traces the actions of a scummy app, it is the job of our society’s law enforcement to take your report and investigate your evidence before forcing the company to let them inspect the attic.