[Home]DoYouRatOnYourFriend

ec2-52-14-168-56.us-east-2.compute.amazonaws.com | ToothyWiki | RecentChanges | Login | Webcomic

In short:
Yes you should.  But you're doomed.  Because your friend will follow the same logic you do, and end up ratting you out - so you should at least take him down with you.

Eh?  On what logic?  If he's ratted on you, you're dead.  Doesn't really matter beyond that.  If he hasn't ratted on you, why kill him?  - Kazuhiko

Because if you rat on him and he's already ratted on you, you both get off with a lighter sentence.  If you rat and he doesn't, you get off scot free.  M-A
Oh yeah - sorry, got my original problem statement slightly wrong.  The scoring requires the following ranks: you rat and he doesn't is best for you and worst for him, no-one rats is second best, both rat is next, and worst for you is he rats while you don't (the standard PrisonersDilemma ordering).  The gaps between these ranks are usually exaggerated to point out that normal MinMax? play aims for the best outcome, which is only slightly better than second place, misses the second case (easy to attain with trust), and ends up at the third case - mutual defection. -- Vitenka

Things get MUCH more interesting with repeated trials.

Since I'm filling bits out, may as well add that TitForTat does pretty damn well - though it isn't optimal.  TheSelfishGene gives a good explanation of what happens next, and why you end up needing a strategy for a strategy for a strategy for a strategy for a strategy for a strategy for a strategy for a strategy for a strategy ... for choosing the strategy you will play with.  I call Meta? ;)  -- Vitenka
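A minimal sketch of the iterated game, showing why TitForTat does well against an unconditional defector without being exploitable.  The payoff values here are illustrative assumptions (T=5, R=3, P=1, S=0 - the usual textbook numbers, not the ones used further down this page):

```python
# A minimal iterated Prisoner's Dilemma.  Payoffs are illustrative
# assumptions (T=5, R=3, P=1, S=0), not the values used later on this page.
PAYOFF = {  # (my move, their move) -> my score; 'C' = stay loyal, 'D' = rat
    ('D', 'C'): 5, ('C', 'C'): 3, ('D', 'D'): 1, ('C', 'D'): 0,
}

def tit_for_tat(my_history, their_history):
    """Cooperate on the first move, then copy the opponent's last move."""
    return 'C' if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    """Rat every single time."""
    return 'D'

def play(strat_a, strat_b, rounds=10):
    """Run `rounds` rounds and return (score_a, score_b)."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strat_a(hist_a, hist_b)
        b = strat_b(hist_b, hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # (30, 30): mutual cooperation
print(play(tit_for_tat, always_defect))  # (9, 14): suckered once, then even
```

TitForTat only ever loses by a single sucker payoff, which is exactly why it is hard to exploit - but also why it isn't strictly optimal.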

I once wrote my own evolution-type simulation for this. It settled down to 'cooperate if the previous moves were the same, defect otherwise' - from all starting conditions (including total defection). I also tried depth-two hindsight, but the results were not amenable to soundbite summary.
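That rule - cooperate if the previous round's moves matched, defect otherwise - appears to be the strategy usually called Pavlov, or win-stay-lose-shift.  As a decision function:

```python
def pavlov(my_last, their_last):
    """'Cooperate if the previous moves were the same, defect otherwise.'

    Equivalent to Pavlov / win-stay-lose-shift: after a good outcome
    (mutual cooperation, or a successful defection) you keep your move;
    after a bad one you switch.  First round: cooperate.
    """
    if my_last is None:
        return 'C'
    return 'C' if my_last == their_last else 'D'

print(pavlov('C', 'C'))  # C - both cooperated, carry on
print(pavlov('D', 'D'))  # C - mutual defection is bad, try cooperating
print(pavlov('C', 'D'))  # D - just got suckered, switch
print(pavlov('D', 'C'))  # D - defection paid off, keep doing it
```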


It would be interesting if biological costs could be simulated.  Let the players pay points to buy special abilities like "reduce the damage taken from being ratted upon" (armour), "increase the damage you do when you rat on them" (being vindictive), etc.  Add in costs for features used by strategies (like you have to pay for a good enough memory to be able to refer to the last X events).

Suppose we have 8 players.  In the first round each player gets to put 8 contestants into a pool (64 contestants in all). During each round pairs of contestants are picked at random from the pool.  This continues until all contestants have been picked 24 times each on average.  The least successful 32 die off.  The next 8 carry on.  The next 16 carry on and breed once.  The top 8 carry on and breed twice. The next round then starts with this new pool (again 64 strong).

What sort of state do people think would be reached by this?  Would one strategy conquer or would an ecology of strategies be stable? --DR
Inquisitor did something similar, quite extensively, a few years back - except much less drastically: a group of 10 or so would be picked from a pool of a couple of hundred, and would then compete, with the most successful breeding to replace the least successful. Even then, it settled on a single strategy (he thinks - he'd need to rerun it to check there wasn't something more subtle going on).
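The selection step of the 64-contestant scheme above can be sketched as follows.  'Breeding' is plain cloning here - a placeholder assumption for whatever mutation or crossover would really be used:

```python
def next_generation(scored):
    """One selection/breeding step for the scheme described above.

    `scored` is a list of 64 (contestant, score) pairs.  The worst 32
    die; the next 8 just survive; the next 16 survive and breed once;
    the top 8 survive and breed twice.  Pool size stays at 64:
    8 + 2*8 + 16 + 16 + 8 = 64.
    """
    ranked = [c for c, _ in sorted(scored, key=lambda cs: cs[1], reverse=True)]
    top8, mid16, next8 = ranked[:8], ranked[8:24], ranked[24:32]
    new_pool = (top8 + top8 + top8   # survive, plus two offspring each
                + mid16 + mid16      # survive, plus one offspring each
                + next8)             # survive only
    assert len(new_pool) == 64
    return new_pool
```

For example, feeding in contestants numbered 0-63 with score equal to their number, contestant 63 appears three times in the new pool, contestant 40 twice, contestant 35 once, and contestant 0 not at all.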


Supplementary question: If you could encode a strategy as a series of binary decisions (a strand of 'DNA'), such that two strategies could be merged to produce a third syntactically well-formed strategy of similar nature, would this be more or less likely to produce a varied ecology? (Assume that #1 breeds with #2, #3 with #4, etc.) --DR

So for example, suppose the payoffs are:
A defects, B defects: A gets  20, B gets  20
A defects, B loyalty: A gets 100, B gets  0
A loyalty, B defects: A gets  0, B gets 100
A loyalty, B loyalty: A gets  80, B gets  80

At the start of a round A can buy armour at a cost of 10 per armour point (to a max of 16).
Thus if A bought 6 points of armour, the table would look like:
A defects, B defects: A gets  26, B gets  20
A defects, B loyalty: A gets 100, B gets  0
A loyalty, B defects: A gets   6, B gets 100
A loyalty, B loyalty: A gets  80, B gets  80

Weapons can also be bought, for just 5 per weapon point.  B having a point of weapon cancels A having a point of armour.
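Putting the rules above into code - assuming armour adds one point to your payoff per uncancelled point whenever the opponent defects, weapons cancel the opponent's armour one-for-one, and purchase costs (10 per armour point, 5 per weapon point) are paid separately up front:

```python
# The base payoff table from above: (A's move, B's move) -> (A's pay, B's pay)
BASE = {
    ('D', 'D'): (20, 20),
    ('D', 'C'): (100, 0),
    ('C', 'D'): (0, 100),
    ('C', 'C'): (80, 80),
}

def payoff(move_a, move_b, armour_a=0, armour_b=0, weapon_a=0, weapon_b=0):
    """Adjusted payoffs.  Assumption: each uncancelled armour point adds
    1 to your payoff when the opponent defects; the opponent's weapon
    points cancel your armour one-for-one (never below zero)."""
    pay_a, pay_b = BASE[(move_a, move_b)]
    if move_b == 'D':  # A is being ratted on: armour helps, B's weapon cancels
        pay_a += max(0, armour_a - weapon_b)
    if move_a == 'D':  # and symmetrically for B
        pay_b += max(0, armour_b - weapon_a)
    return pay_a, pay_b

# A with 6 points of armour, reproducing the table above:
print(payoff('D', 'D', armour_a=6))  # (26, 20)
print(payoff('C', 'D', armour_a=6))  # (6, 100)
```

If B also buys 6 weapon points, A's armour is cancelled entirely and the table drops back to the original values.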

The amount of weapon and/or armour to buy might be encoded into 4 bits in a number of ways.

encoding 1 - Bits 1 and 2 encode how many points in total to spend.  Bits 3 and 4 encode the armour to weapon balance.

encoding 2 - Bits 1 and 2 encode how much armour to get (0,4,8 or 16 points).  Bits 3 and 4 encode how much weapon to get (0,4,8 or 16 points).

encoding 3 - Bit 1 is 8 armour, bit 2 is a further 8 armour, bit 3 is 8 weapon, bit 4 is 8 weapon.

encoding 4 - bit 1 is a pacifist override: if set, nothing is bought.  Bit 2, if bit 1 is not set, means full war (16 points in each).  If neither bit 1 nor bit 2 is set, bits 3 and 4 give access to 4 milder strategic options. (Hmm, possibility of dominant alleles?)

Which encoding pattern would make for most effective sexual reproduction?
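For a concrete feel, here is encoding 3 (the one where each bit acts independently) together with the single-point crossover the question imagines.  Note what happens when a pure-armour parent breeds with a pure-weapon parent:

```python
def decode3(genome):
    """Encoding 3: four independent bits, each buying an 8-point block.
    genome = (armour8, more_armour8, weapon8, more_weapon8), each 0 or 1.
    Returns (armour points, weapon points)."""
    b1, b2, b3, b4 = genome
    return 8 * (b1 + b2), 8 * (b3 + b4)

def crossover(g1, g2, point=2):
    """Single-point crossover - the kind of merge the question imagines."""
    return g1[:point] + g2[point:]

parent_a = (1, 1, 0, 0)   # 16 armour, no weapons
parent_b = (0, 0, 1, 1)   # no armour, 16 weapons
child = crossover(parent_a, parent_b)
print(decode3(child))     # (16, 16): the child goes to full war
```

Under encoding 3 the crossover of a pure defender and a pure aggressor is a fully armed tank; under encoding 4 the same merge could instead be dominated by a single pacifist bit - which is exactly why the choice of encoding shapes the ecology.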

Last edited October 15, 2004 1:41 am (viewing revision 12, which is the newest) (diff)