To determine whether it's a spider or not, you want to examine the HTTP_USER_AGENT header that is sent with every request to your site. What you have to create is a conditional that says: if the user agent matches a known spider, ignore the cookie requirement.
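Here's a minimal sketch of that in PHP (just an assumption on my part that you're using PHP; is_spider(), serve_page(), require_cookie(), and the session_id cookie name are all placeholders for whatever your script actually does):

    <?php
    // Grab the user agent string the client sent with this request.
    $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

    if (is_spider($agent)) {
        serve_page();       // known spider: no cookie needed
    } elseif (isset($_COOKIE['session_id'])) {
        serve_page();       // normal visitor who has the cookie
    } else {
        require_cookie();   // normal visitor without the cookie
    }
    ?>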
I did a quick search for lists of user agents, and found quite a large list here:
I have no idea whether these are the best or not - I just wanted to provide you with an example.
As to how to implement this, there are a couple of ways you could do it. You could put a comprehensive list of user agents into a file, and then load the file and check against that. Doing it that way gives you the advantage of being able to update the list as often as required, and it makes maintenance really quite easy.
You can also load the user agents into an array and iterate through the array when checking; the two combine naturally, since you can read the file straight into an array, as in the sketch below.
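A rough sketch of that combined approach (PHP again; the spiders.txt filename is my own made-up example, holding one user agent substring per line, e.g. "googlebot" or "slurp"):

    <?php
    // Hypothetical helper: returns true if $agent contains any of the
    // spider strings listed (one per line) in spiders.txt.
    function is_spider($agent) {
        $spiders = file('spiders.txt');   // load the file, one line per array element
        foreach ($spiders as $spider) {
            $spider = trim($spider);      // strip the trailing newline
            if ($spider != '' && stripos($agent, $spider) !== false) {
                return true;              // matched a known spider
            }
        }
        return false;
    }
    ?>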
Or the quick and easy hack is to choose 2 or 3 really important ones (googlebot and inktomi slurp come to mind) and create a longer if statement (if (googlebot || inktomi.slurp) and so on).
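That hack might look something like this:

    <?php
    // Quick hack: hard-code a check for just a couple of big spiders.
    $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    if (stripos($agent, 'googlebot') !== false
            || stripos($agent, 'slurp') !== false) {
        // treat as a spider: skip the cookie requirement
    }
    ?>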
Oh yeah, you can check to see how this is working for you (and if it is working at all) by setting your script to exempt Mozilla and then trying to load the page. If you can get in without the cookie, then you know you're set, and you can get rid of the exemption for Mozilla and off you go on your way.
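That test works because pretty much every mainstream browser identifies itself as "Mozilla/..." in its user agent, so a temporary check like this makes your own browser get treated as a spider:

    <?php
    // Temporary, for testing only: exempts any normal browser, since
    // they all send "Mozilla" in the user agent. Remove after testing!
    if (stripos($agent, 'mozilla') !== false) {
        // skip the cookie requirement
    }
    ?>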
Does that help at all?
Bob's quote of the week:
"world cup of hockey is starting isn't it [, Dilligaf]? Who are you cheering for dilli?"
"Well... if you promise not to tell my neighbours... "Oh Canada... ""