Right, so in your cyclic lists the "list head" is a special list node that has the same data type as all the other nodes but doesn't contain useful payload data?

No, I was thinking of the simpler case in which all nodes in the list are real nodes containing real payload, and head is not a node but just a pointer.
I don't think your approach would even have occurred to me, and it certainly wasn't the version used by the actual list of elephants I encountered in the incident that gave rise to this entire thought process. The whole point of them using a cyclic list was that they wanted to be able to go round and round the list very efficiently, so they made advancing to the next (real) element a trivial operation requiring no test. But if a dummy list-head node appears at some point in the list, that advantage is lost, because now advancing to the next element looks something like
elephant = elephant->next;
if (elephant == head)
    elephant = elephant->next;
which is no more convenient than doing the obvious thing with a linear list
elephant = elephant->next;
if (!elephant)
    elephant = head;
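Whereas in the version I had in mind, with no dummy node anywhere in the cycle, advancing needs no test at all. A minimal sketch, assuming a node type something like the following (the struct layout and the names here are my invention; only the next pointer is implied by the above):

struct elephant {
    struct elephant *next;   /* the last node's next points back to the
                                first, so the list is a pure cycle */
    int payload;             /* stand-in for the real payload data */
};

/* Advancing to the next (real) element is a single untested assignment: */
static struct elephant *advance(struct elephant *elephant)
{
    return elephant->next;
}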
Mmm. The doubly-linked cyclic list model is almost canonical for implementing a list that supports bidirectional traversal with arbitrary insertions and removals (there's a sneakier version where one instead stores in each node the difference between the addresses of its previous and next nodes; this saves a word in each node, but complicates updates). There, treating the head/tail as the next element after the last one and the previous element before the first one gives considerable orthogonality and symmetry benefits, making the code simpler and more efficient. A particularly relevant nicety is that the empty list ceases to be a special case for anything. The objective is emphatically not to allow one to iterate through the list's head/tail in some kind of wrap-around.
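For concreteness, that sentinel arrangement looks something like this; a minimal sketch in the spirit of the usual intrusive-list pattern, with names of my own choosing rather than taken from any particular library:

struct node {
    struct node *prev, *next;
};

/* An empty list is just the sentinel linked to itself. */
static void list_init(struct node *head)
{
    head->prev = head->next = head;
}

/* Insert n immediately before pos; list_insert(head, n) appends. */
static void list_insert(struct node *pos, struct node *n)
{
    n->prev = pos->prev;
    n->next = pos;
    pos->prev->next = n;
    pos->prev = n;
}

/* Unlink n from whatever list it is in. */
static void list_remove(struct node *n)
{
    n->prev->next = n->next;
    n->next->prev = n->prev;
}

Neither insertion nor removal ever has to ask whether the list is empty; that is the special case going away.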
The data structure you're looking at seems much more specialised and peculiar; I'm left wondering what its purpose is, and what operations it's expected to permit, with what complexity order. In particular, any kind of insertion or removal is fiddly, but if the structure is essentially static, why not just use contiguous memory with modular arithmetic?
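Something along these lines, say (the array size and the names are of course made up):

#include <stddef.h>

#define NELEPHANTS 12                  /* hypothetical fixed population */

struct elephant { int payload; };      /* stand-in payload, no link fields needed */

static struct elephant herd[NELEPHANTS];

/* Going round and round becomes modular arithmetic on an index: */
static size_t next_elephant(size_t i)
{
    return (i + 1) % NELEPHANTS;
}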
Oh, I don't think we're talking about complexity order. We're right down in the realm of cycle-shaving here, making the constant-time operations happen in smaller or larger constant amounts of time.
If you're down to cycle-shaving, surely your for loop spells disaster? The fact that the list is cyclic is not so intrinsic that the optimiser can spot it, so the null check in elephant ? elephant != head : (elephant = head) != NULL will have to be performed each time round the loop.
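For reference, the loop I'm objecting to is presumably along these lines (reconstructed from the condition quoted above rather than copied from anywhere, and reusing the elephant struct sketched earlier):

static void visit_all(struct elephant *head)   /* head is NULL for an empty list */
{
    struct elephant *elephant;
    for (elephant = NULL;
         elephant ? elephant != head : (elephant = head) != NULL;
         elephant = elephant->next) {
        /* body runs once per node, starting at head,
           and not at all when the list is empty */
    }
}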
I know what you mean, but once I got my head round the conversation, Simon's description made sense to me: if you want to traverse the circle more often than you want to do anything which involves treating the head=null case specially, then the circular implementation without distinguished elements is conceptually simpler, and gives you the freedom to write extremely optimised assembly later, at any point it's needed, even if in the meantime you only have a simple loop.