
In Firefox it's possible using the following...

HTMLDocument.prototype.__defineGetter__("cookie",function (){return "foo=bar";});
HTMLDocument.prototype.__defineSetter__("cookie",function (){});

This doesn't cause any errors in WebKit, and WebKit definitely supports __defineGetter__ and __defineSetter__, but it doesn't work. Guessing WebKit is protecting that property somehow.

So, any ideas of how to achieve the same effect in WebKit?
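For reference, the standardized way to attempt the same override is `Object.defineProperty`, which is the modern equivalent of `__defineGetter__`/`__defineSetter__`. The sketch below runs against a stand-in object so it works outside a browser; in a real page you would target `document` itself, and whether the override sticks still depends on the browser.

```javascript
// Stand-in for `document`, so this sketch runs outside a browser.
const fakeDocument = {};

// Object.defineProperty with accessor functions is the standardized
// equivalent of __defineGetter__/__defineSetter__.
let swallowed = "";
Object.defineProperty(fakeDocument, "cookie", {
  configurable: true,
  get() { return "foo=bar"; },          // always report a fixed value
  set(value) { swallowed = value; }     // swallow writes, record for inspection
});

console.log(fakeDocument.cookie); // "foo=bar"
fakeDocument.cookie = "baz=qux";  // write is intercepted by the setter
console.log(fakeDocument.cookie); // still "foo=bar"
```

On a real `document`, success varies by engine and version; the mechanics of the accessor pair are the same either way.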


asked Jun 14, 2009 at 8:47 by netnichols

1 Answer


Have you tried defining the getter/setter pair on the document object itself, instead of on the prototype?

document.__defineGetter__("cookie", function() { return "foo=bar"; });
document.__defineSetter__("cookie", function() {});

I know it shouldn't matter, but I never underestimate browser quirks, even in WebKit.
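The instance-vs-prototype distinction can be demonstrated with plain objects in any environment where the legacy `__defineGetter__`/`__defineSetter__` methods exist on `Object.prototype` (e.g. Node.js). The `Doc` constructor here is just an illustrative stand-in, not a real DOM type:

```javascript
// A getter defined on the prototype can be shadowed by an accessor
// defined directly on the instance, which is why defining on the
// object itself can behave differently from defining on the prototype.
function Doc() {}
Doc.prototype.__defineGetter__("cookie", function () { return "from prototype"; });

const doc = new Doc();
console.log(doc.cookie); // "from prototype"

// Defining directly on the instance takes precedence:
doc.__defineGetter__("cookie", function () { return "foo=bar"; });
doc.__defineSetter__("cookie", function () {}); // ignore writes
console.log(doc.cookie); // "foo=bar"
```

If the browser protects `document.cookie` at the prototype level only, the instance-level definition may still get through, which is what the suggestion above is probing.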

Update

I've done some tests in Chrome 2, and it appears that it only allows defining a setter. I'm not sure how well this observation transfers to WebKit itself, though, as Google Chrome uses a different JavaScript engine (V8) than Safari's WebKit (JavaScriptCore).

Tags: javascript | Is it possible to override document.cookie in WebKit - Stack Overflow