
Source Code for Module Gnumed.pycommon.gmBusinessDBObject

  1  """GNUmed database object business class. 
  2   
  3  Overview 
  4  -------- 
  5  This class wraps a source relation (table, view) which 
  6  represents an entity that makes immediate business sense 
  7  such as a vaccination or a medical document. In many if 
  8  not most cases this source relation is a denormalizing 
  9  view. The data in that view will in most cases, however, 
 10  originate from several normalized tables. One instance 
 11  of this class represents one row of said source relation. 
 12   
 13  Note, however, that this class does not *always* simply 
 14  wrap a single table or view. It can also encompass several 
 15  relations (views, tables, sequences etc) that taken together 
 16  form an object meaningful to *business* logic. 
 17   
 18  Initialization 
 19  -------------- 
 20  There are two ways to initialize an instance with values. 
 21  One way is to pass a "primary key equivalent" object into 
 22  __init__(); refetch_payload() will then pull the data from 
 23  the backend. Another way is to fetch the data outside 
 24  the instance and pass it in via the <row> argument. In that 
 25  case the instance will not initially connect to the database, 
 26  which may offer a considerable performance boost. 
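
For illustration, assuming a hypothetical child class cExampleItem
wrapping a view with primary key column pk_item (all names made up):

        # 1) construct by primary key, hits the backend right away
        item = cExampleItem(aPK_obj = 12)

        # 2) reuse a row fetched elsewhere, no extra round trip
        rows, idx = gmPG2.run_ro_queries(queries = [{'cmd': cmd, 'args': args}], get_col_idx = True)
        item = cExampleItem(row = {'data': rows[0], 'idx': idx, 'pk_field': 'pk_item'})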
 27   
 28  Values API 
 29  ---------- 
 30  Field values are cached for later access. They can be accessed 
 31  by a dictionary API, eg: 
 32   
 33          old_value = object['field'] 
 34          object['field'] = new_value 
 35   
 36  The field names correspond to the respective column names 
 37  in the "main" source relation. Accessing non-existent field 
 38  names will raise an error, as does trying to set fields not 
 39  listed in self.__class__._updatable_fields. To actually 
 40  store updated values in the database one must explicitly 
 41  call save_payload(). 
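
For example (field name and value are placeholders only):

        obj['comment'] = u'new comment'
        successful, data = obj.save_payload()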
 42   
 43  The class will in many cases be enhanced by accessors to 
 44  related data that is not directly part of the business 
 45  object itself but is closely related, such as codes 
 46  linked to a clinical narrative entry (eg a diagnosis). Such 
 47  accessors in most cases start with get_*. Related setters 
 48  start with set_*. The values can be accessed via the 
 49  object['field'] syntax, too, but they will be cached 
 50  independently. 
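
A sketch of such a pair in a child class (the names and the query
are made up, they are not part of this module):

        def get_codes(self):
                cmd = u'select code from example.linked_codes where fk_item = %(pk)s'
                rows, idx = gmPG2.run_ro_queries(queries = [{'cmd': cmd, 'args': {'pk': self.pk_obj}}], get_col_idx = True)
                return [ r[0] for r in rows ]

        def set_codes(self, codes=None):
                # ... store the links, return True on success
                return True

Afterwards both are also reachable as object['codes'].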
 51   
 52  Concurrency handling 
 53  -------------------- 
 54  GNUmed connections always run transactions in isolation level 
 55  "serializable". This prevents transactions running at the 
 56  *very same time* from overwriting each other's data. All but one 
 57  of them will abort with a concurrency error (eg if a 
 58  transaction runs a select-for-update later than another one 
 59  it will hang until the first transaction ends and will then 
 60  succeed or fail depending on what the first transaction 
 61  did). This is standard transactional behaviour. 
 62   
 63  However, another transaction may have updated our row 
 64  between the time we first fetched the data and the time we 
 65  start the update transaction. This is noticed by getting the 
 66  XMIN system column for the row when initially fetching the 
 67  data and using that value as a where condition value when 
 68  updating the row later. If the row had been updated (xmin 
 69  changed) or deleted (primary key disappeared) in the 
 70  meantime the update will touch zero rows (as no row with 
 71  both PK and XMIN matching is found) even if the query itself 
 72  syntactically succeeds. 
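
In a child class this mechanism typically shows up in its store
commands along these lines (table and field names are placeholders only):

        _cmds_store_payload = [
                u'''update example.items set
                        comment = %(comment)s
                where
                        pk = %(pk_item)s
                        and xmin = %(xmin_item)s''',
                u'select xmin as xmin_item from example.items where pk = %(pk_item)s'
        ]

The final query refetches the new XMIN so the cache can be updated;
an UPDATE touching zero rows means the cached payload is stale.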
 73   
 74  When detecting a change in a row due to XMIN being different 
 75  one needs to be careful how to represent that to the user. 
 76  The row may simply have changed but it also might have been 
 77  deleted and a completely new and unrelated row which happens 
 78  to have the same primary key might have been created ! This 
 79  row might relate to a totally different context (eg. patient, 
 80  episode, encounter). 
 81   
 82  One can offer all the data to the user: 
 83   
 84  self.original_payload 
 85  - contains the data at the last successful refetch 
 86   
 87  self.modified_payload 
 88  - contains the locally modified payload just before the last 
 89    failure of save_payload() - IOW the data this instance 
 90    tried (and failed) to store 
 91   
 92  self._payload 
 93  - contains the currently active payload which may or 
 94    may not contain changes 
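
Code reacting to a failed save_payload() might hence present what
this instance knows about each field, along these lines (sketch only):

        for field in obj.get_fields():
                print field, obj.original_payload[field], obj[field]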
 95   
 96  For discussion on this see the thread starting at: 
 97   
 98          http://archives.postgresql.org/pgsql-general/2004-10/msg01352.php 
 99   
100  and here 
101   
102          http://groups.google.com/group/pgsql.general/browse_thread/thread/e3566ba76173d0bf/6cf3c243a86d9233 
103          (google for "XMIN semantic at peril") 
104   
105  Problem cases with XMIN: 
106   
107  1) not unlikely 
108  - a very old row is read with XMIN 
109  - vacuum comes along and sets XMIN to FrozenTransactionId 
110    - now XMIN changed but the row actually didn't ! 
111  - an update with "... where xmin = old_xmin ..." fails 
112    although there is no need to fail 
113   
114  2) quite unlikely 
115  - a row is read with XMIN 
116  - a long time passes 
117  - the original XMIN gets frozen to FrozenTransactionId 
118  - another writer comes along and changes the row 
119  - incidentally the exact same old row gets the old XMIN *again* 
120    - now XMIN is (again) the same but the data changed ! 
121  - a later update fails to detect the concurrent change !! 
122   
123  TODO: 
124  The solution is to use our own column for optimistic locking 
125  which gets updated by an AFTER UPDATE trigger. 
126  """ 
127  #============================================================ 
128  __version__ = "$Revision: 1.60 $" 
129  __author__ = "K.Hilbert <Karsten.Hilbert@gmx.net>" 
130  __license__ = "GPL" 
131   
132  import sys, copy, types, inspect, logging, datetime 
133   
134   
135  if __name__ == '__main__': 
136          sys.path.insert(0, '../../') 
137  from Gnumed.pycommon import gmExceptions, gmPG2 
138   
139   
140  _log = logging.getLogger('gm.db') 
141  _log.info(__version__) 
142  #============================================================ 
class cBusinessDBObject(object):
    """Represents business objects in the database.

    Rules:
    - instances ARE ASSUMED TO EXIST in the database
    - PK construction (aPK_obj): DOES verify its existence on instantiation
      (fetching data fails)
    - Row construction (row): allowed by using a dict of pairs
      field name: field value (PERFORMANCE improvement)
    - does NOT verify FK target existence
    - does NOT create new entries in the database
    - does NOT lazy-fetch fields on access

    Class scope SQL commands and variables:

    <_cmd_fetch_payload>
    - must return exactly one row
    - where clause argument values are expected
      in self.pk_obj (taken from __init__(aPK_obj))
    - must return xmin of all rows that _cmds_store_payload
      will be updating, so views must support the xmin columns
      of their underlying tables

    <_cmds_store_payload>
    - one or multiple "update ... set ... where xmin_* = ..." statements
      which actually update the database from the data in self._payload
    - the last query must refetch the XMIN values needed to detect
      concurrent updates, their field names had better be the same as
      in _cmd_fetch_payload

    <_updatable_fields>
    - a list of fields available for update via object['field']
    """
    #--------------------------------------------------------
    def __init__(self, aPK_obj=None, row=None):
        """Init business object.

        Call from child classes:

            super(cChildClass, self).__init__(aPK_obj = aPK_obj, row = row)
        """
        # initialize those "too early" because checking descendants might
        # fail which will then call __str__ in stack trace logging if --debug
        # was given which in turn needs those instance variables
        self.pk_obj = '<uninitialized>'
        self._idx = {}
        self._payload = []          # the cache for backend object values (mainly table fields)
        self._ext_cache = {}        # the cache for extended method's results
        self._is_modified = False

        # check descendants
        self.__class__._cmd_fetch_payload
        self.__class__._cmds_store_payload
        self.__class__._updatable_fields

        if aPK_obj is not None:
            self.__init_from_pk(aPK_obj=aPK_obj)
        else:
            self._init_from_row_data(row=row)

        self._is_modified = False
    #--------------------------------------------------------
    def __init_from_pk(self, aPK_obj=None):
        """Creates a new business object instance from its primary key.

        aPK_obj can be:
        - a simple value
          * the primary key WHERE condition must be
            a simple column
        - a dictionary of values
          * the primary key WHERE condition must be a
            subselect consuming the dict and producing
            the single-value primary key
        """
        self.pk_obj = aPK_obj
        result = self.refetch_payload()
        if result is True:
            self.original_payload = {}
            for field in self._idx.keys():
                self.original_payload[field] = self._payload[self._idx[field]]
            return True

        if result is False:
            raise gmExceptions.ConstructorError, "[%s:%s]: error loading instance" % (self.__class__.__name__, self.pk_obj)
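
    # Illustration of the dict form (SQL and names are placeholders only):
    # a child class whose _cmd_fetch_payload reads, say,
    #
    #   u"select * from example.v_items where pk_item = (select pk_item from example.v_items where col_a = %(a)s and col_b = %(b)s)"
    #
    # would be instantiated as cExampleChild(aPK_obj = {'a': 1, 'b': 2}).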
    #--------------------------------------------------------
    def _init_from_row_data(self, row=None):
        """Creates a new business object instance from a row of field values.

        row must be a dict with the fields:
        - pk_field: the name of the primary key field
        - idx: a dict mapping field names to position
        - data: the field values in a list (as returned by
          cursor.fetchone() in the DB-API)

        row = {'data': row, 'idx': idx, 'pk_field': 'the PK column name'}

        rows, idx = gmPG2.run_ro_queries(queries = [{'cmd': cmd, 'args': args}], get_col_idx = True)
        objects = [ cChildClass(row = {'data': r, 'idx': idx, 'pk_field': 'the PK column name'}) for r in rows ]
        """
        try:
            self._idx = row['idx']
            self._payload = row['data']
            self.pk_obj = self._payload[self._idx[row['pk_field']]]
        except:
            _log.exception('faulty <row> argument structure: %s' % row)
            raise gmExceptions.ConstructorError, "[%s:??]: error loading instance from row data" % self.__class__.__name__

        if len(self._idx.keys()) != len(self._payload):
            _log.critical('field index vs. payload length mismatch: %s field names vs. %s fields' % (len(self._idx.keys()), len(self._payload)))
            _log.critical('faulty <row> argument structure: %s' % row)
            raise gmExceptions.ConstructorError, "[%s:??]: error loading instance from row data" % self.__class__.__name__

        self.original_payload = {}
        for field in self._idx.keys():
            self.original_payload[field] = self._payload[self._idx[field]]
    #--------------------------------------------------------
    def __del__(self):
        if self.__dict__.has_key('_is_modified'):
            if self._is_modified:
                _log.critical('[%s:%s]: losing payload changes' % (self.__class__.__name__, self.pk_obj))
                _log.debug('original: %s' % self.original_payload)
                _log.debug('modified: %s' % self._payload)
    #--------------------------------------------------------
    def __str__(self):
        tmp = []
        try:
            for attr in self._idx.keys():
                if self._payload[self._idx[attr]] is None:
                    tmp.append(u'%s: NULL' % attr)
                else:
                    tmp.append('%s: >>%s<<' % (attr, self._payload[self._idx[attr]]))
            return '[%s:%s]: %s' % (self.__class__.__name__, self.pk_obj, str(tmp))
        except:
            return 'nascent [%s @ %s], cannot show payload and primary key' % (self.__class__.__name__, id(self))
    #--------------------------------------------------------
    def __getitem__(self, attribute):
        # use try: except: as it is faster and we want this as fast as possible

        # 1) backend payload cache
        try:
            return self._payload[self._idx[attribute]]
        except KeyError:
            pass

        # 2) extension method results ...
        getter = getattr(self, 'get_%s' % attribute, None)
        if not callable(getter):
            _log.warning('[%s]: no attribute [%s]' % (self.__class__.__name__, attribute))
            _log.warning('[%s]: valid attributes: %s' % (self.__class__.__name__, str(self._idx.keys())))
            _log.warning('[%s]: no getter method [get_%s]' % (self.__class__.__name__, attribute))
            methods = filter(lambda x: x[0].startswith('get_'), inspect.getmembers(self, inspect.ismethod))
            _log.warning('[%s]: valid getter methods: %s' % (self.__class__.__name__, str(methods)))
            raise gmExceptions.NoSuchBusinessObjectAttributeError, '[%s]: cannot access [%s]' % (self.__class__.__name__, attribute)

        self._ext_cache[attribute] = getter()
        return self._ext_cache[attribute]
    #--------------------------------------------------------
    def __setitem__(self, attribute, value):

        # 1) backend payload cache
        if attribute in self.__class__._updatable_fields:
            try:
                if self._payload[self._idx[attribute]] != value:
                    self._payload[self._idx[attribute]] = value
                    self._is_modified = True
                return
            except KeyError:
                _log.warning('[%s]: cannot set attribute <%s> despite being marked settable' % (self.__class__.__name__, attribute))
                _log.warning('[%s]: supposedly settable attributes: %s' % (self.__class__.__name__, str(self.__class__._updatable_fields)))
                raise gmExceptions.NoSuchBusinessObjectAttributeError, '[%s]: cannot access [%s]' % (self.__class__.__name__, attribute)

        # 2) setters providing extensions
        if hasattr(self, 'set_%s' % attribute):
            setter = getattr(self, "set_%s" % attribute)
            if not callable(setter):
                raise gmExceptions.NoSuchBusinessObjectAttributeError, '[%s] setter [set_%s] not callable' % (self.__class__.__name__, attribute)
            try:
                del self._ext_cache[attribute]
            except KeyError:
                pass
            if type(value) is types.TupleType:
                if setter(*value):
                    self._is_modified = True
                    return
                raise gmExceptions.BusinessObjectAttributeNotSettableError, '[%s]: setter [%s] failed for [%s]' % (self.__class__.__name__, setter, value)
            if setter(value):
                self._is_modified = True
                return

        # 3) don't know what to do with <attribute>
        _log.error('[%s]: cannot find attribute <%s> or setter method [set_%s]' % (self.__class__.__name__, attribute, attribute))
        _log.warning('[%s]: settable attributes: %s' % (self.__class__.__name__, str(self.__class__._updatable_fields)))
        methods = filter(lambda x: x[0].startswith('set_'), inspect.getmembers(self, inspect.ismethod))
        _log.warning('[%s]: valid setter methods: %s' % (self.__class__.__name__, str(methods)))
        raise gmExceptions.BusinessObjectAttributeNotSettableError, '[%s]: cannot set [%s]' % (self.__class__.__name__, attribute)
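
    # Note on the tuple branch above (set_something() is only an example name):
    # assigning a tuple spreads it across the setter's arguments, ie
    #
    #   obj['something'] = (value1, value2)
    #
    # ends up calling self.set_something(value1, value2), while any other
    # value is passed as a single argument, self.set_something(value).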
    #--------------------------------------------------------
    # external API
    #--------------------------------------------------------
    def same_payload(self, another_object=None):
        raise NotImplementedError('comparison between [%s] and [%s] not implemented' % (self, another_object))
    #--------------------------------------------------------
    def is_modified(self):
        return self._is_modified
    #--------------------------------------------------------
    def get_fields(self):
        try:
            return self._idx.keys()
        except AttributeError:
            return 'nascent [%s @ %s], cannot return keys' % (self.__class__.__name__, id(self))
    #--------------------------------------------------------
    def get_updatable_fields(self):
        return self.__class__._updatable_fields
    #--------------------------------------------------------
    def get_patient(self):
        _log.error('[%s:%s]: forgot to override get_patient()' % (self.__class__.__name__, self.pk_obj))
        return None
    #--------------------------------------------------------
    def refetch_payload(self, ignore_changes=False):
        """Fetch field values from the backend."""
        if self._is_modified:
            if ignore_changes:
                _log.critical('[%s:%s]: losing payload changes' % (self.__class__.__name__, self.pk_obj))
                _log.debug('original: %s' % self.original_payload)
                _log.debug('modified: %s' % self._payload)
            else:
                _log.critical('[%s:%s]: cannot reload, payload changed' % (self.__class__.__name__, self.pk_obj))
                return False

        if type(self.pk_obj) == types.DictType:
            arg = self.pk_obj
        else:
            arg = [self.pk_obj]
        rows, self._idx = gmPG2.run_ro_queries (
            queries = [{'cmd': self.__class__._cmd_fetch_payload, 'args': arg}],
            get_col_idx = True
        )
        if len(rows) == 0:
            _log.error('[%s:%s]: no such instance' % (self.__class__.__name__, self.pk_obj))
            return False
        self._payload = rows[0]
        return True
    #--------------------------------------------------------
    def __noop(self):
        pass
    #--------------------------------------------------------
    def save(self, conn=None):
        return self.save_payload(conn = conn)
    #--------------------------------------------------------
    def save_payload(self, conn=None):
        """Store updated values (if any) in the database.

        Optionally accepts a pre-existing connection.
        - returns a tuple (<True|False>, <data>)
        - True: success
        - False: an error occurred
          * data is (error, message)
          * for error meanings see gmPG2.run_rw_queries()
        """
        if not self._is_modified:
            return (True, None)

        args = {}
        for field in self._idx.keys():
            args[field] = self._payload[self._idx[field]]
        self.modified_payload = args

        close_conn = self.__noop
        if conn is None:
            conn = gmPG2.get_connection(readonly=False)
            close_conn = conn.close

        # query succeeded but failed to find the row to lock
        # because another transaction committed an UPDATE or
        # DELETE *before* we attempted to lock it ...
        # FIXME: this can fail if savepoints are used since subtransactions change the xmin/xmax ...

        queries = []
        for query in self.__class__._cmds_store_payload:
            queries.append({'cmd': query, 'args': args})
        rows, idx = gmPG2.run_rw_queries (
            link_obj = conn,
            queries = queries,
            return_data = True,
            get_col_idx = True
        )

        # update cached XMIN values (should be in the first-and-only result row of the last query)
        row = rows[0]
        for key in idx:
            try:
                self._payload[self._idx[key]] = row[idx[key]]
            except KeyError:
                conn.rollback()
                close_conn()
                _log.error('[%s:%s]: cannot update instance, XMIN refetch key mismatch on [%s]' % (self.__class__.__name__, self.pk_obj, key))
                _log.error('payload keys: %s' % str(self._idx))
                _log.error('XMIN refetch keys: %s' % str(idx))
                _log.error(args)
                raise

        conn.commit()
        close_conn()

        self._is_modified = False
        # update to the new "original" payload
        self.original_payload = {}
        for field in self._idx.keys():
            self.original_payload[field] = self._payload[self._idx[field]]

        return (True, None)

#============================================================
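
#------------------------------------------------------------
# Illustration only: a minimal sketch of a child class. All schema
# and class names below (example.items, example.v_items, pk_item,
# xmin_item, comment, cExampleItem) are made up; real child classes
# live in their own modules and follow the contract described in the
# cBusinessDBObject docstring above.

class cExampleItem(cBusinessDBObject):
    """Sketch only: wraps one row of the (hypothetical) view example.v_items.

    The view is assumed to expose pk_item, comment and the xmin of
    the underlying table example.items aliased as xmin_item.
    """
    _cmd_fetch_payload = u"select * from example.v_items where pk_item = %s"

    _cmds_store_payload = [
        u"""update example.items set
                comment = %(comment)s
            where
                pk = %(pk_item)s
                and xmin = %(xmin_item)s""",
        u"""select xmin as xmin_item from example.items where pk = %(pk_item)s"""
    ]

    _updatable_fields = ['comment']

# Typical use (still with the made-up names), not executed here:
#
#   item = cExampleItem(aPK_obj = 1)            # fetches the row
#   item['comment'] = u'some note'
#   successful, data = item.save_payload()      # concurrent edits show up via xmin

#============================================================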
def jsonclasshintify(obj):
    # this should eventually be somewhere else
    """Turn the data into a list of dicts, adding "class hints".

    All objects get turned into dictionaries which the other end
    will interpret as "object", via the __jsonclass__ hint,
    as specified by the JSONRPC protocol standard.
    """
    if isinstance(obj, list):
        return map(jsonclasshintify, obj)
    elif isinstance(obj, gmPG2.dbapi.tz.FixedOffsetTimezone):
        # this will get decoded as "from jsonobjproxy import {clsname}"
        # at the remote (client) end.
        res = {'__jsonclass__': ["jsonobjproxy.FixedOffsetTimezone"]}
        res['name'] = obj._name
        res['offset'] = jsonclasshintify(obj._offset)
        return res
    elif isinstance(obj, datetime.timedelta):
        # this will get decoded as "from jsonobjproxy import {clsname}"
        # at the remote (client) end.
        res = {'__jsonclass__': ["jsonobjproxy.TimeDelta"]}
        res['days'] = obj.days
        res['seconds'] = obj.seconds
        res['microseconds'] = obj.microseconds
        return res
    elif isinstance(obj, datetime.time):
        # this will get decoded as "from jsonobjproxy import {clsname}"
        # at the remote (client) end.
        res = {'__jsonclass__': ["jsonobjproxy.Time"]}
        res['hour'] = obj.hour
        res['minute'] = obj.minute
        res['second'] = obj.second
        res['microsecond'] = obj.microsecond
        res['tzinfo'] = jsonclasshintify(obj.tzinfo)
        return res
    elif isinstance(obj, datetime.datetime):
        # this will get decoded as "from jsonobjproxy import {clsname}"
        # at the remote (client) end.
        res = {'__jsonclass__': ["jsonobjproxy.DateTime"]}
        res['year'] = obj.year
        res['month'] = obj.month
        res['day'] = obj.day
        res['hour'] = obj.hour
        res['minute'] = obj.minute
        res['second'] = obj.second
        res['microsecond'] = obj.microsecond
        res['tzinfo'] = jsonclasshintify(obj.tzinfo)
        return res
    elif isinstance(obj, cBusinessDBObject):
        # this will get decoded as "from jsonobjproxy import {clsname}"
        # at the remote (client) end.
        res = {'__jsonclass__': ["jsonobjproxy.%s" % obj.__class__.__name__]}
        for k in obj.get_fields():
            res[k] = jsonclasshintify(obj[k])
        print "props", res, dir(obj)
        for attribute in dir(obj):
            if not attribute.startswith("get_"):
                continue
            k = attribute[4:]
            if res.has_key(k):
                continue
            getter = getattr(obj, attribute, None)
            if callable(getter):
                res[k] = jsonclasshintify(getter())
        return res
    return obj
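
#------------------------------------------------------------
# Illustration only: the structure returned by jsonclasshintify() is
# plain lists/dicts/scalars and can be fed to any JSON encoder, eg
# (assuming Python >= 2.6 for the stdlib json module):
#
#   import json
#   wire_data = json.dumps(jsonclasshintify(obj))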

#============================================================
if __name__ == '__main__':

    if len(sys.argv) < 2:
        sys.exit()

    if sys.argv[1] != u'test':
        sys.exit()

    #--------------------------------------------------------
    class cTestObj(cBusinessDBObject):
        _cmd_fetch_payload = None
        _cmds_store_payload = None
        _updatable_fields = []
        #----------------------------------------------------
        def get_something(self):
            pass
        #----------------------------------------------------
        def set_something(self):
            pass
    #--------------------------------------------------------
    from Gnumed.pycommon import gmI18N
    gmI18N.activate_locale()
    gmI18N.install_domain()

    data = {
        'pk_field': 'bogus_pk',
        'idx': {'bogus_pk': 0, 'bogus_field': 1},
        'data': [-1, 'bogus_data']
    }
    obj = cTestObj(row=data)
    #print obj['wrong_field']
    #print jsonclasshintify(obj)
    obj['wrong_field'] = 1

#============================================================