Wednesday, 26 February 2020

Design pattern for an interface whose implementations behave differently if they are part of a range

I have a base class that looks something like this:

#include <string>

class Object {
public:
  virtual ~Object() = default;
  virtual void print() = 0;
  virtual void set(std::string) = 0;
  virtual void destroy() = 0;  // "delete" is a reserved word in C++, so the method is spelled destroy() here
};

Instances of Object are meant to exhibit certain behavior: setting some data, printing the data previously set, and releasing the resources held by the object (please ignore the fact that this could be done with a destructor for now). Calling either of the other two functions after calling delete() causes an exception to be thrown. All three functions are safe to invoke under concurrent workloads, with reads being trivially safe, while writes need to be serialized via a mutex, a dedicated write thread, or something similar.
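A minimal sketch of what such an implementation might look like, assuming a single mutex serializes all calls and a flag records deletion (the names `SimpleObject` and `destroy()`, standing in for the reserved word `delete`, are illustrative, not from the original post):

```cpp
#include <cassert>
#include <mutex>
#include <stdexcept>
#include <string>

// Illustrative base interface; destroy() stands in for the reserved word "delete".
class Object {
public:
  virtual ~Object() = default;
  virtual void print() = 0;
  virtual void set(std::string) = 0;
  virtual void destroy() = 0;
};

// One possible implementation: a mutex serializes access, and a flag
// makes any use after destroy() throw.
class SimpleObject : public Object {
public:
  void print() override {
    std::lock_guard<std::mutex> lock(mutex_);
    throwIfDestroyed();
    // ... actual printing of data_ omitted in this sketch
  }
  void set(std::string value) override {
    std::lock_guard<std::mutex> lock(mutex_);
    throwIfDestroyed();
    data_ = std::move(value);
  }
  void destroy() override {
    std::lock_guard<std::mutex> lock(mutex_);
    throwIfDestroyed();
    destroyed_ = true;  // release any held resources here
  }
private:
  void throwIfDestroyed() const {
    if (destroyed_) throw std::logic_error("object already destroyed");
  }
  std::mutex mutex_;
  std::string data_;
  bool destroyed_ = false;
};
```

For simplicity this sketch locks on reads as well; making reads lock-free, as the post suggests is possible, would need an atomic flag instead.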

Then I have an implementation of this interface that is exposed only as part of an ordered data structure - say, a linked list. However, its behavior differs from the description above in one respect: when we call delete() on a node, it deletes not only that node, but all nodes before it as well.
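As a concrete illustration of that range behavior (the class and method names here are hypothetical), deleting a node could erase every node from the head of the list up to and including the node itself:

```cpp
#include <cstddef>
#include <iterator>
#include <list>
#include <string>

// Hypothetical list whose delete operation removes a node
// together with all of its predecessors.
class NodeList {
public:
  // Append a value; the returned iterator identifies the node.
  // (std::list iterators stay valid when other elements are erased.)
  std::list<std::string>::iterator append(std::string value) {
    return items_.insert(items_.end(), std::move(value));
  }
  // Erase the given node *and every node before it*.
  void destroyUpTo(std::list<std::string>::iterator node) {
    items_.erase(items_.begin(), std::next(node));
  }
  const std::string& front() const { return items_.front(); }
  std::size_t size() const { return items_.size(); }
private:
  std::list<std::string> items_;
};
```

In a real implementation each erased node would also have to start throwing on further use, which is where the synchronization question in the post comes from.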

Is there a particular type of design pattern I can use to express this difference in the behavior of the delete() function? I could, for example, say in the comments that instances can spontaneously be deleted out from under the user, and that users need external synchronization (e.g. from the linked-list implementation), but I am not sure that is the best solution (maybe some sort of capability query is appropriate here?).
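One way such a capability query could look, sketched with invented names (`Deletable`, `deleteAffectsPredecessors`), is a virtual predicate on the base interface that the range-deleting implementation overrides, letting callers ask before relying on plain deletion semantics:

```cpp
// Sketch of a capability query: the base reports plain, single-object
// deletion, and the range-deleting node overrides the answer.
class Deletable {
public:
  virtual ~Deletable() = default;
  // Invented name: does deleting this object also remove earlier nodes?
  virtual bool deleteAffectsPredecessors() const { return false; }
};

class ListNode : public Deletable {
public:
  bool deleteAffectsPredecessors() const override { return true; }
};
```

A caller holding only a `Deletable*` can then branch on the query instead of relying on a comment, at the cost of widening the interface.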
