BPM.com
Peter Schooff
BPM Discussions
Thursday, 26 October 2017
From Ian Gotts: How do you think the GDPR, which is the regulation by which the European Parliament, the Council of the European Union and the European Commission intend to strengthen and unify data protection for all individuals within the European Union, will impact processes?
Accepted Answer Pending Moderation
Sure. The best way to implement GDPR is to consider it as a BPM application. See [1].

Thanks,
AS
References
  1. http://improving-bpm-systems.blogspot.ch/2017/06/gdpr-as-bpm-application.html
# 1
More work and more compliance for everybody: yes, it will impact everything, including processes.
# 2
GDPR will increase consumer trust; in turn, this means that any GDPR-compliant process or app is likely to see better adoption. Interestingly, the 2017 Accenture strategy document "GDPR: A Slice of PII, With a Side of Digital Trust" also claims that, along with the benefits of trustworthiness, organizational data management costs may be reduced as a consequence of having to comply. Of course, implementing GDPR will be technically difficult; no doubt there will be headlines . . .
Comment
To the extent that John's premise is correct, and I hope it is, the correlation between increased trust and broader adoption is incredibly important. Thanks for pointing that out.
@John/Scott

Seems to me the recent Equifax scandal has set back "consumer trust" a couple of decades.

As for the US White House, who needs to worry about the Russians? The White House cannot even keep its confidential phone conversations confidential.
# 3
I cannot comment on GDPR, but fines in the USA for inadvertent or willful disclosure of what is there called healthcare PHI (protected health information) are more than sufficient to make vendors ensure they are not central to such disclosures.

One fine and the vendor (absent proper insurance, which is expensive) would have to shut down.

The approach our group has taken for machine-to-machine data exchange is to outboard from the Case Management/BPM system a data exchange engine that sets up an e-hub accommodating any number of publishers and subscribers.

Publishers push; subscribers typically pull (owing to the complexity of linking to, say, 100 different subscribers, each of which wants a different subset of a publisher's data, each of which wants to use its own data element naming conventions, and each of which wants its own data transport format).

So, publisher P1 has a-b-c-d to share and calls this "addr, city, country, phone". P2 wants to see "location, ct, address, ph", whereas P3 wants only three data element values: "residence phone, street address, cty".

Publishers reasonably want to publish their data once, not generate a "package" for each subscriber. Clearly, you need formatters and parsers.
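The publish-once pattern above can be sketched as a minimal hub with per-subscriber field maps. All values and the exact correspondence between P1's element names and the subscribers' labels are illustrative assumptions, since the original doesn't specify which element maps to which:

```python
# Sketch of the e-hub: publisher P1 posts its data once under its own
# element names; the hub reformats on pull for each subscriber.
# (Values and the name-to-name mapping below are assumed for illustration.)

PUBLISHED = {"addr": "10 Main St", "city": "Geneva",
             "country": "CH", "phone": "022-555-0100"}  # P1's one-time post

# Each subscriber registers the subset it wants, under its own names.
SUBSCRIBER_MAPS = {
    "P2": {"addr": "address", "city": "ct", "country": "location", "phone": "ph"},
    "P3": {"addr": "street address", "city": "cty", "phone": "residence phone"},
}

def pull(subscriber):
    """Format the publisher's single record for one subscriber on demand."""
    mapping = SUBSCRIBER_MAPS[subscriber]
    return {their_name: PUBLISHED[our_name]
            for our_name, their_name in mapping.items()}

print(pull("P3"))  # only the three elements P3 asked for, under P3's names
```

The formatter runs on the hub at pull time, which is what lets the publisher publish once rather than generating a package per subscriber.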

Users who want interactive services log into a portal, where they see a menu of services (which can be restricted by role) and can submit a request that goes to an engine. The engine alone is able to access and log into the back-end DBMS. It consolidates responses, and the responses go back to the portal. The portal user never knows the ID of the record of interest; in fact, they don't even know the name of the server or where the server is.
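A minimal sketch of that portal/engine split, assuming an in-memory stand-in for the back-end DBMS; `SESSIONS`, `BACKEND` and the service name are invented for illustration:

```python
# Sketch of the portal/engine split: only the engine holds back-end access;
# the portal user never sees record IDs or the server location.
# SESSIONS, BACKEND and the service name are hypothetical.

BACKEND = {"patient-001": {"name": "John Doe", "phone": "022-555-0100"}}
SESSIONS = {"token-abc": "patient-001"}  # portal session token -> internal record ID

def engine_request(token, service):
    """The engine alone resolves the internal ID and queries the back end."""
    record_id = SESSIONS[token]            # never returned to the portal
    record = BACKEND[record_id]
    if service == "contact-details":
        return {"phone": record["phone"]}  # consolidated, ID-free response
    raise ValueError("service not on this role's menu")

print(engine_request("token-abc", "contact-details"))
```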

The weak link is the user at the portal, who may have given themselves a password called "password". Enforcing strong passwords means they need a password manager such as LastPass on their devices; otherwise they will write the password on a sticker and attach it to the smartphone.

We try to push two-factor authentication (which is no help when the PIN is 0000).

We do have pre-processing rules that go like this: you are asking about patient John Doe; he is not one of your patients; the request gets re-routed to admin for voice contact and clarification.
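A rule of that kind might look like the following sketch; the care-team table and names are hypothetical stand-ins for the real patient/provider relationship data:

```python
# Sketch of a pre-processing rule: requests about patients outside the
# requester's care team are re-routed to admin for voice clarification.
# The care-team table and names are hypothetical.

CARE_TEAM = {"john-doe": {"dr-smith", "nurse-jones"}}

def route_request(requester, patient):
    if requester in CARE_TEAM.get(patient, set()):
        return "forward-to-engine"
    return "reroute-to-admin"  # clarification by voice contact

print(route_request("dr-brown", "john-doe"))  # reroute-to-admin
```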

Don't put too much reliance on de-identification: hackers can connect the dots and figure out things that most folks would consider "impossible" to figure out.
# 4
GDPR will force marketing and sales to start documenting what they do, and then following it. No longer can they be the wild wild west.

The three challenges are:

Job #1: find where all the data is stored and work out what is Personal Data. This is a non-trivial task for most organizations. Throw away what is not required, or what you shouldn't have; then get opt-in consent for the rest.

Job #2: create/document the marketing and sales processes, most of which are manual, taking into account the new GDPR requirements. Many cannot be automated. Add the new GDPR processes.

Job #3 (and this is the difficult bit): educate the staff on what they can and cannot do anymore.
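The discovery step in Job #1 could be sketched like this; the regexes are illustrative heuristics only, not a complete definition of personal data under the GDPR:

```python
import re

# Sketch of Job #1: scan records in a data store and flag fields whose
# values look like personal data. The patterns are illustrative heuristics.

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\+?\d[\d\s()-]{7,}\d")

def find_personal_data(record):
    """Return the subset of fields whose values look like PII."""
    return {field: value for field, value in record.items()
            if isinstance(value, str)
            and (EMAIL.search(value) or PHONE.search(value))}

record = {"id": "42", "notes": "call +41 22 555 0100", "contact": "jane@example.com"}
print(sorted(find_personal_data(record)))  # ['contact', 'notes']
```

In practice this scan would run over every data store found in the inventory, and flagged fields would feed the throw-away-or-get-consent decision.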
Comment
John Morris:
Interesting twist, @Ian, to focus especially on sales, with the idea that sales info and processes need to be (I think you are implying) more closely governed. As you say, it's the wild west.
Intriguingly, you also mention "personal data", and so now we see the elephant in the room: sales data is also personal data, as in the data that reps carry in their heads or personal notebooks!
It's universal behaviour (and little studied or managed). Demands that reps dump the contents of their heads into CRM usually amount to a demand for free labour and the indenture of your children (regardless of the vendor's legitimate interest in sales information). These are unsolved issues.
GDPR may be a great thing, but your note on its impact on sales opens a can of lovely worms indeed.
# 5
Agreed, this will undoubtedly require a detailed understanding of all surrounding processes. Delivery of the full audit trail of who did what and when, plus controlled access to sensitive data, will give the required assurance. Presenting the applications in a readily understood process map will aid achievement of the required compliance. However, as indicated, the use of "notebooks" will be a challenge, as users see their knowledge being absorbed into the structured system. The trade-off should be greater empowerment, with real-time feedback of activity increasing productivity while still satisfying compliance.
# 6
Compliance-enhancing projects often tend to be overly costly while having little to no tangible benefit, at least at first glance, for the end users and their customers. Vendors in that sense sometimes take advantage of a short-term hype cycle, fed by a new piece of legislation (example: FATCA) or important adverse events (example: Equifax), to hastily repackage and reprice existing software suites, going aggressively after anxious and unprepared companies.
On the other hand, a well-thought-through process design approach, with an accompanying architecture, will not only align with new regulations such as the GDPR but also improve the overall customer experience. This example in particular, aiming for a Europe-wide, unified rule set on the level of data protection, holds the grand potential of also having a unifying effect on the underlying business processes. That in turn would provide end customers with enhanced security and more transparency, and businesses with benchmarking frameworks.
A true effect on technologies, processes and procedures, however, will ultimately depend on the economic impact of penalties for noncompliance, minimizing the effects of moral hazard.
In parallel, it's important to take into account all the voluntary data-exposing initiatives from individuals and businesses alike, such as human microchipping, DNA mapping and nationwide inventorying, and to analyze realistically how effective any data protection regulation can really be.
NSI Soluciones - ABPMP PTY
# 7
Fortunately, the GDPR requires "privacy by design and privacy by default"; only explicit architecting and rearchitecting of enterprise systems can meet this requirement. Reactive documenting and storing of audit trails are not enough.

Thanks,
AS
Comment
@Alexander . . . Makes sense. The Civerex products have three access mechanisms: a) direct log-in, b) portal log-in, and c) data import/export streams (bulk or "trickle" transactions).

If you are a "regular" user, you have a user/pass and this lets you into the system. For healthcare (to set the terminology), with [a] you have a "role" that details where in the app you may go and what processing you can engage in. You have an InTray; BPM posts process steps, as they become current for any patient requiring services from that role, to all members of that role. Users view/record data, and the system (no option here) automatically posts to the History a date and timestamp, a user "signature", plus the data as it was at the time it was recorded, on the form that was in service for that task at that time. Once in, the data cannot be changed, but a user can insert an ad hoc process of one step, see the data come forward, carry out edits, and the History then gains a "today/now" revision. We also commit to the History any visits to a patient record/step/form where no data was recorded (so that we can later respond to "who accessed John Doe's diagnosis screen last August 15th?"). Clicking Commit at a process step also automatically exports any data that you, as a publisher, want to share, but the default is NO sharing with any subscriber.
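The automatic history commit described above amounts to an append-only log. Here is a minimal sketch; the field names and in-memory list are assumptions, not Civerex's actual implementation:

```python
from datetime import datetime, timezone

# Sketch of the automatic history commit: every edit or view appends an
# immutable entry; a later edit adds a new revision rather than changing
# the old one. Field names are assumptions.

HISTORY = []  # append-only; nothing is ever updated in place

def commit(user, step, form, data):
    """Record data-as-it-was, or a data-free access event (data=None)."""
    HISTORY.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "who": user,
        "where": f"{step}/{form}",
        "data": dict(data) if data else None,  # copy, so later edits can't mutate it
    })

commit("nurse-jones", "intake", "diagnosis-screen", {"dx": "flu"})
commit("dr-smith", "intake", "diagnosis-screen", None)  # view only, still logged
# "Who accessed the diagnosis screen?" -> filter HISTORY by the "where" field
```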

With [b], you log into a portal via a user/pass as for [a]. You have an InTray and tasks post, but the difference is that you are at arm's length from the app. All communication goes to a separate server, to an engine. The engine alone is able to log in, establish a cursor position in the RDBMS, read/write, and push back out to the portal any data relating to the record "owner", i.e. the patient. As with [a], data goes to the History and data is exported. The user does not know where the back-end server is, does not know the ID of the patient, cannot establish a cursor position at any patient record, cannot insert ad hoc steps, and cannot skip over or re-visit completed steps, but they can get to a menu of services where requests may be entertained (depending on rules). Customers, suppliers, and contractors typically are the users in [b].

For [c], any prospective subscriber to data exported to the data exchanger must come forward, cap in hand, and demonstrate why they want this or that data. They must provide their "name" for each data element, because we recognize that each subscriber (an organization, not an individual) typically wants to receive data using its own data element naming convention. Organizations can post data to the data exchanger and read data they have been granted access to; we keep a record of the data they were offered, track the cursor position in the data exchanger where they last read, and track when they have read the data. Pre-processing rules are in place to sniff out bad incoming data when a subscriber posts data for import to our app.
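The per-subscriber read-cursor tracking could be sketched as follows; the stream and cursor structures are hypothetical:

```python
from datetime import datetime, timezone

# Sketch of the exchanger's read tracking: each subscribing organization
# has a cursor into the export stream, so we know what it was offered,
# where it last read, and when.

EXPORT_STREAM = [{"seq": 1, "payload": "addr change"},
                 {"seq": 2, "payload": "phone change"}]
CURSORS = {"org-acme": {"last_seq": 0, "last_read": None}}  # one per subscriber

def read_new(org):
    """Return items past the org's cursor, then advance it and stamp the read."""
    cursor = CURSORS[org]
    new_items = [item for item in EXPORT_STREAM if item["seq"] > cursor["last_seq"]]
    if new_items:
        cursor["last_seq"] = new_items[-1]["seq"]
        cursor["last_read"] = datetime.now(timezone.utc).isoformat()
    return new_items

print(len(read_new("org-acme")))  # 2: first read returns everything offered
```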

We have not been able to think of any required additional user type beyond a, b, and c.
# 8