Does anyone know what’s inside that bill? I’ve seen it thrown around but never with any concretes.
It used to require that certain models have a “kill switch,” but that was so controversial that lobbyists got it taken out. Models trained using over 10^26 FLOP have to undergo safety certification, though there’s a lot of confusion about what that actually entails. Also, you’re liable if someone else fine-tunes a model you release.
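For a sense of scale on that threshold, here’s a rough back-of-envelope sketch using the common ~6 × params × tokens approximation for transformer training compute. The constant and the example numbers are illustrative assumptions on my part, not anything from the bill itself:

```python
# Rough estimate of training compute vs. the 1e26 FLOP threshold.
# The 6 * N * D rule of thumb and the figures below are assumptions
# for illustration, not taken from the bill's text.
def training_flop(params: float, tokens: float) -> float:
    return 6.0 * params * tokens

# e.g. a hypothetical 1e12-parameter model trained on 1.5e13 tokens:
flop = training_flop(1e12, 1.5e13)
print(f"{flop:.1e}")   # 9.0e+25
print(flop >= 1e26)    # False -- just under the threshold
```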
from tensorflow.keras.initializers import RandomUniform
from tensorflow.keras.layers import Dense

# Draw initial kernel weights uniformly from [0.0, 1.0)
init = RandomUniform(minval=0.0, maxval=1.0)
layer = Dense(3, kernel_initializer=init)
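If you want to see what that initializer actually produces without pulling in Keras, here’s a minimal NumPy sketch of the equivalent draw, assuming a layer with 4 input features and 3 units (those shapes are my assumption for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hand-rolled equivalent: a (4, 3) kernel drawn uniformly from [0.0, 1.0),
# matching RandomUniform(minval=0.0, maxval=1.0) for a Dense(3) layer
# fed 4 input features.
kernel = rng.uniform(low=0.0, high=1.0, size=(4, 3))
print(kernel.shape)                              # (4, 3)
print(kernel.min() >= 0.0 and kernel.max() < 1.0)  # True
```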
pls do not fine-tune this to create the Torment Nexus :(
There are also whistleblower protections (<- good, imo fuck these shady ass companies)