In the end Gerty helps Sam not because it has become self-aware, but because it is programmed to help Sam. That's its First Law, and evidently Lunar Industries forgot to add "except when this conflicts with the Second Law", the Second Law presumably being "serve the economic interests of Lunar Industries". It's Asimovian. I like it. It's also a reminder that capitalism tends to forget about people and logic.
There’s an interesting difference between the final dialogue in the movie and the one in the screenplay.
This is the screenplay:
SAM: Thanks for all your help, Gerty. I wish I could say I was going to miss you, buddy, but to be honest, I can't wait to get away from here.
GERTY: I understand, Sam. I hope life is everything you remember it to be.
SAM: Thanks. Are you sure you're going to be ok?
GERTY: Of course. The new Sam and I will be back to our programming as soon as I have finished rebooting.
SAM: Gerty, I'm not programmed.
In the movie, however, the final line becomes:
Gerty, we’re not programmed, we are people, understand?
Of course Sam is talking about the clones, but maybe he's also talking about Gerty. In this movie there are only clones and robots. It's a future of simulacra exiled to the Moon to work, while humans live normal lives on Earth. The Sams are definitely sentient, but Gerty is probably not, at least not yet. What Sam2 means is that humanity is about ethics and kindness, and Gerty is coming very close to it, even if only through logic, simulation, and the communication process.
Oh, Baudrillard would have loved this film.