ProM Tips – Which Log Filters Should You Use? — Flux Capacitor
ProM is a sophisticated analysis platform; using it to its full extent is not straightforward. This blog post provides very handy guidance on filters.
This article is mainly a collection of opinions and hints about the tools available for book publishing.
It is a popular service providing authors with several ways to write and publish paid books online.
There is someone who wants to move forward:
I will cover them in further revisions of this article:
I recently found a really provocative and interesting article about the current status and trends for ERP software applications.
Without any doubt, many customers of the “Big ERP” vendors feel the weight of complex, monolithic systems that have accumulated, over the years, an impressive amount of investment. Despite this situation, I believe there are now some factors that will, sooner or later, spark a completely new generation of solutions for medium, big, and very big businesses.
If I look at my professional history, the initial growth of the main ERP solutions (SAP, JD Edwards, PeopleSoft, Oracle, etc.) was arguably based on two disruptive technological factors:
– relational databases,
– availability of long-range data connections
The first made it possible to build scalable applications, able to quickly process vast amounts of data (“vast” on the scale of the 1970s and 1980s).
The second allowed companies to dismiss the multitude of local, strongly customized applications and consolidate onto a single, standard application platform, accessed remotely from all the plants, sales offices, local branches, etc.
Today we have at hand some new technologies – first of all unstructured databases (NoSQL) and strongly transactional ledgers (blockchain) – plus cloud, service orientation, Big Data, and all the commercial hype that vendors are eager to promote.
The big – and emerging – ERP producers are reacting by adding new, fancier user interfaces and additional modules but, in my opinion, are failing on some key points. They are adding stuff on top of existing applications, leaving the underlying architecture untouched.
Furthermore, they have failed to recognise the growing importance of the “personal information” aspect of document processing. This has led to attention on the “presentation” features of user interfaces, rather than on their “content”. There are still strong barriers dividing the enterprise side of the information from the personal side, and this requires frequent context switches from the user.
So I tried to imagine some features for a “New ERP” software architecture.
If you are reading my post, you can easily share it with someone else, simply by passing them the link to the browser page (or using the “share” button embedded in all the mobile apps). Your friend will receive your share via mail, or some other sort of generalized messaging or notification system.
If you are looking at an invoice on your ERP system, you cannot do this. Maybe you can, if your colleague operates on the same software and is logged on to the same server.
In my dream ERP, every customer order, GL posting, article master data record, or purchasing request should be available as a single page (web or mobile, it doesn’t matter) with a simple link (obviously with all the authentication and permission checking in place).
All the documents (EDI, pdf, etc.), mails, chats, comments, records should be available in the same “space”.
They may be physically dispersed across several locations or cloud systems, but their index should be unique, exactly as the index of a search engine is unique, even if it indexes millions of different servers.
Please note that, for me, email should be a totally integrated feature. This means that the mail server should be a module of the system and should be able to tag each incoming mail with the relevant references, analyzing the semantic meaning of the message body and attachments.
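As a minimal sketch of this tagging idea: the simplest (non-semantic) version just scans each incoming message for document references. The reference formats (`INV-…`, `PO-…`) and function names here are assumptions for illustration, not the API of any real product.

```python
import re

# Hypothetical document-reference formats an ERP might embed in mail
# subjects and bodies; real systems would add semantic analysis on top.
REFERENCE_PATTERNS = {
    "invoice": re.compile(r"\bINV-\d{6}\b"),
    "purchase_order": re.compile(r"\bPO-\d{6}\b"),
}

def tag_incoming_mail(subject: str, body: str) -> dict:
    """Scan an incoming message and collect ERP document references."""
    text = f"{subject}\n{body}"
    tags = {}
    for doc_type, pattern in REFERENCE_PATTERNS.items():
        found = pattern.findall(text)
        if found:
            tags[doc_type] = sorted(set(found))
    return tags

tags = tag_incoming_mail(
    "Re: your order PO-000123",
    "Please find attached invoice INV-004567 for PO-000123.",
)
# tags == {"invoice": ["INV-004567"], "purchase_order": ["PO-000123"]}
```

A real mail module would resolve each tag to the document link discussed above, so the message lands in the same “space” as the documents it refers to.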
Microsoft, for example, is pushing the integration of its Outlook messaging app with its NAV ERP solution.
The fastest way to find something should be the search box.
You type “Invoice June 2016 ACME”, and you get the list of all the invoices issued to/from ACME in the month of June – but ALSO the mail you have exchanged with the sales rep about this invoice, and maybe a report where this invoice is listed.
You will be able to browse and refine your search.
The key point is that ANYTHING should be indexed, not only the “keys” recorded on transactions. If there is a customer order with a description containing “provided by ACME Corp.”, it should be in the result list, even if the customer/supplier is not ACME.
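The “index ANYTHING” point can be sketched with a toy inverted index: every word of every field is indexed, not only the key fields, so the ACME example above works even when ACME is not the customer on the order. The document ids and field names are invented for illustration.

```python
from collections import defaultdict

def build_index(documents: dict) -> dict:
    """Index every word of every field, not only the 'key' fields."""
    index = defaultdict(set)
    for doc_id, fields in documents.items():
        for value in fields.values():
            for token in str(value).lower().split():
                index[token.strip(".,")].add(doc_id)
    return index

docs = {
    "order-1001": {"customer": "Initech",
                   "description": "parts provided by ACME Corp."},
    "invoice-2002": {"customer": "ACME", "amount": 1500},
}
index = build_index(docs)
# index["acme"] contains both documents: the invoice (ACME is the
# customer) AND the order, whose free-text description mentions ACME.
```

A production system would of course use a real search engine rather than this toy, but the principle is the same: the free text of transactions is part of the index.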
Search should go beyond content searching; I think that menu items, user guides, and how-tos should also be searchable in the same way. So, if you have to post a new lease-out contract, simply type “new lease-out contract”: the result list will show you the link to the active page where you can post the contract, as well as a link to the guide, and maybe also an alert telling you that, from the first of July, “lease-out contracts should be posted under a new category”, or something else.
You will be able to save anything in your favourites, be it a document, a menu item, a mail.
Social ERP doesn’t mean a bad Facebook clone with a different name and your Company directory preloaded.
It means that all the documents/records you process, your comments on them, and your approvals or rejections will be part of a content stream, categorized under many keys (“Project X”, “Lead Y”, “incoming invoices”, “maintenance requests”) cleverly assigned to each item.
So, if a purchasing request of yours is approved by your boss, you will read this event in your main stream. But if your boss has some issue with it, he will simply annotate the request; you will receive this notification and will be able to see it in the context of the request, not as a separate mail.
And a mail coming from a vendor will become a document within the stream related to your purchase requisition. Your colleagues will be able to comment on it and, if properly configured, each comment will become a reply mail to the original author.
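The stream idea above can be sketched in a few lines: every event is published once and becomes visible under each of its category keys, so the approval and the vendor mail both show up in the context of the same request. Event texts and keys are illustrative.

```python
from collections import defaultdict

# Minimal sketch of the content stream: one chronological feed, plus a
# view per category key, so an item is readable in every relevant context.
stream = []
by_key = defaultdict(list)

def publish(event: str, keys: list) -> None:
    """Append an event to the main stream and to each category view."""
    entry = {"event": event, "keys": keys}
    stream.append(entry)
    for key in keys:
        by_key[key].append(entry)

publish("Purchasing request PR-77 approved", ["Project X", "purchasing"])
publish("Vendor mail attached to PR-77", ["purchasing", "incoming mail"])
# by_key["purchasing"] now holds both events, in chronological order.
```

The point of the sketch is that categorization is a view over one stream, not a separate mailbox per topic.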
5. Workflow and capabilities
Workflow automation is a great tool when you follow a course on process modelling, or watch a demo. Why do business executives dislike process models? Because they are a mess. And they are a mess not because your organisation is poorly designed, or overwhelmingly complex, but because reality is flexible and fuzzy, and it has to be so.
A modern ERP should incorporate workflow management, but only as a “main” process path, keeping track of milestone approvals and allowing a flexible “I will take charge of this” pattern. The system should provide you with a clear view of the process pipeline, highlight exceptions, and allow flexible collaboration on each item.
6. Related content
It means that you should be able to find, and link, anything related to the content you are viewing.
People working on items tend to manage them mainly as a chronological sequence; all the other forms of archiving are useful, but not “natural”. Then, when they read an item, the mental mechanism of association kicks in, and their memory suggests the existence of related content. This should be naturally implemented in the system, simply enhancing the natural power of the user’s mind.
Related contents may include documents that have a functional relationship (an order with a goods receipt, a VAT posting with an invoice), which will be managed by the system, as well as mails, memos, spreadsheets, and drawings that can be useful for evaluation, knowledge sharing, and so on.
Localization is always an issue. ERP vendors are required to maintain some functions (e.g. VAT, withholding tax, property tax), or to develop some functions for one country only. This has a cost – a huge cost – and customers pay it in terms of maintenance fees. I live in Italy, where the regulator is often tricky and new requirements and fiscal reports blossom every year, and ERP vendors often don’t provide a timely, simple, and fully effective solution. Maybe the fact that the local functions are developed offshore, by people who have never had “field” experience of our regulatory reality, is a factor influencing the final quality.
In my vision, a localized application for VAT will be developed only for Italy (or India, Argentina, the USA…) by a local software developer with strong ties to, and knowledge of, the evolving regulatory landscape. When you post an invoice, you will generate several different, related (see point 6) documents:
– the invoice itself, with metadata describing only the essential information (date, number, vendor, total amount, customer);
– the general ledger document, with only account codes and amounts;
– the VAT document;
– the Withholding tax document, if required;
– the financial (account payable cash flow) document;
– the goods-receipt clearing document.
This means that:
– the application footprint is small;
– user interfaces require a careful design (but this is already true for the legacy, monolithic, applications);
– different documents are linked only by a “structured” link, made of the document URL and a few integrity constraints based on document values.
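The posting-as-linked-documents idea can be sketched as follows: one invoice posting yields several small documents, each addressable by its own URL (see point 1), tied together only by structured links. The URL scheme, field names, and amounts are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    url: str                                    # each document has a plain link
    data: dict
    related: list = field(default_factory=list)  # "structured" links (point 6)

def post_invoice(number: str, vendor: str, total: float, vat: float) -> list:
    """Sketch: one invoice posting generates several small, linked documents."""
    base = "https://erp.example.com/doc"        # hypothetical URL scheme
    invoice = Document(f"{base}/invoice/{number}",
                       {"vendor": vendor, "total": total})
    gl = Document(f"{base}/gl/{number}",
                  {"accounts": {"payables": -total, "costs": total - vat}})
    vat_doc = Document(f"{base}/vat/{number}", {"vat_amount": vat})
    for doc in (gl, vat_doc):
        doc.related.append(invoice.url)          # each side document links back
        invoice.related.append(doc.url)
    return [invoice, gl, vat_doc]

docs = post_invoice("2016-042", "ACME", total=1220.0, vat=220.0)
```

In the vision above, the VAT document would come from a local developer’s module, while the GL document comes from the core: only the links and a few value constraints couple them.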
I listed only a few elements that I believe will characterize the future generation of ERP systems, and that are made possible by the technologies that have emerged in the past years. I am convinced that a new software development approach, conceiving the application as something mixed or meshed with common-use tools like mail, document systems, and social networks, will provide agility, ease of management, and – in the end – an impact on the bottom line of the P&L!
Some of my thoughts are shared by key players in the ERP software industry (see, for example, this interview with the SAP Fiori guru). But I haven’t yet seen a full conceptual design for a “New ERP”.
The news is here.
This is only one of the thousands of articles about this deal. I would like to share some first-hour thoughts from the perspective of the Enterprise Social Networks software market.
Four years ago, Microsoft placed a bet by acquiring Yammer, one of the leading products in the ESN market. The strategy appeared clear:
Today, this move appears only as a partial success.
So, why LinkedIn? When acquisitions like this happen, I prefer to look at the small technical innovations rather than at the corporate-level marketing announcements.
I am expecting some of the following (in the short/medium term):
Will this be a success? Surely this kind of integration may boost the adoption of Office 365 and cloud document services (more than Yammer did), just like the (virtually) free diffusion of Word/Excel in the early times of Windows was a great move to beat the competition in the office automation software market.
There is also tremendous potential in the possible integration of Outlook/Yammer/LinkedIn as a unified business communication platform. But this requires a complete redesign of the product strategy.
On the negative side, there is the current level of “pollution” in LinkedIn, emerging as aggressive and “massive” marketing campaigns, some not-so-professional usage patterns (dating), a decrease in the quality of posts (too much marketing, too many trivial quotes), and – most of all – a decrease in the volume/quality of recruiting, one of the main assets of LinkedIn.
There is a lot of attention, in the software development industry, on software reuse. It’s a good practice, even if reusing “old” software may often be a compromise solution.
Now, I want to share my ideas about “reusing” user interfaces.
In my Company, we manage an ERP system. Our business architecture is quite complex, and several processes are managed with the ERP system. System users are mainly “expert” users: people who post transactions and interact with the system throughout the workday.
There is also a set of “non-expert” users: field technicians, managers, salespeople. These people follow different interaction paths: they require information, but are not aware of “codes”, “transactions”, or “modules”. At the same time, they should interact with the system, but only in a limited way: an approval, a piece of feedback, etc.
Asking them to log in to the system and learn how it works, with its internal logic, may be hard. At the same time, each user may consume a “seat licence”. This means additional licensing and maintenance fees.
There are many solutions for this common issue: enterprise portals, or web-based applications (for purchase requisition approvals, time sheets, service reports, and so on).
But, still, all these apps require the user to adopt a new “channel” to interact with the system: a new user interface, application logic, etc. With the term “channel”, I mean the aggregate of user interface, human–machine interaction paths, and execution environment. Every time a user has to switch from one channel to another (consider leaving your email client and opening a spreadsheet), he spends an amount of mental energy only because the environment changes, and his brain has to “recall” the knowledge about the new channel. This amount of energy increases if the new channel is used much less frequently than the previous one; eventually this may cause a “rejection” of the second channel.
We tried something simple – nothing new, but a comprehensive approach – leveraging the opportunities offered by an integrated cloud service. We tried to use a number of well-known “channels”, like email and the intranet, to deliver some simple interactions with our ERP system.
Today, almost every business operates some basic services to support collaboration:
Every user is able to interact with these systems.
My Company migrated those services to a cloud platform a few years ago, a project that I managed. It was a good move; anyhow, you can find a lot of documents and discussions about this kind of solution by googling around.
But the key advantage of an integrated, cloud-based, collaboration services solution, is search.
The search engine paradigm is one of the easiest patterns for a user interface: you need something, you type what you need in a box, and the system returns a list of results. You can browse through the results, searching for what you need, and eventually finding something useful that you weren’t even aware of.
All this happens in a fraction of a second. Cloud-based search engines are really powerful.
I imagined that leveraging this feature, and using email as a “transactional” interface, might lower the ERP adoption barrier for “non-expert” users.
So, we started to architect and develop some simple software modules, following three main patterns:
For the first pattern, we used document sharing or intranet; for the other two, email, or web-based forms in some cases.
For example, if a mall manager wants to open a purchasing request, he uses a web form. The system then creates a shared folder, which will be used to store RFPs, vendor offers, budget status reports, and all the documents required for the approval process.
The organizational units involved will receive an email with the notice of the new request. The central purchasing administration office will create a request in the ERP system, linking it with the shared folder.
The ERP system will then start publishing documents about the request in the shared folder: budget status, potential vendor lists, etc.
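The purchasing-request flow described above can be sketched with in-memory stand-ins for the folder service and the mail service; the path convention, ids, and unit names are assumptions, not our actual configuration.

```python
# Minimal sketch of the flow: web form -> shared folder -> notification
# mails -> ERP publishes documents into the folder. Storage and mail
# delivery are stubbed with in-memory structures.
shared_folders = {}
outbox = []

def open_purchasing_request(request_id: str, requester: str, units: list) -> str:
    """Create the shared folder and notify the organizational units."""
    folder = f"/purchasing/{request_id}"         # hypothetical path convention
    shared_folders[folder] = []                   # will hold RFPs, offers, reports
    for unit in units:
        outbox.append({"to": unit,
                       "subject": f"New purchasing request {request_id}",
                       "body": f"Opened by {requester}, folder: {folder}"})
    return folder

def publish_document(folder: str, name: str, content: str) -> None:
    """The ERP drops its documents (budget status, vendor lists) into the folder."""
    shared_folders[folder].append({"name": name, "content": content})

folder = open_purchasing_request("PR-77", "mall.manager",
                                 ["purchasing.office", "finance"])
publish_document(folder, "budget-status.pdf", "...")
```

The point is that every artifact of the process ends up in one searchable place, without the requester ever logging in to the ERP.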
Just imagine. Users not directly involved with data entry and transactions on the ERP system are able to find all the information, searching their inbox, or their file collection. They may not be aware that an ERP system is working behind the scenes. All the documents are linked together, just as usual for web pages. The search engine can be used to find everything, typing the name of the vendor, or a project description, or a tenant name.
We have applied this approach to several processes. The final result is that a user can simply search his mail/file/intranet archive freely, typing just a few keywords, and find what he needs.
The training required was close to zero.
Now, a few notes for the techies.
Creating a communication link between an ERP server buried in a closed, protected subnetwork and a cloud service (which is usually insanely protected itself) is not easy.
Luckily our requirements weren’t so strict: no “real-time” interaction was required.
We built our interface on two widely used standards: SMTP and JSON. They are not specific to any platform, they are vendor-neutral, and they are well supported on both our subsystems. So:
Some scripts running on the cloud platform do the processing required to publish the data embedded in the JSON files. We developed a simple template engine to speed up the graphical/layout design phase.
In this way we are able to manage the ERP-to-cloud channel.
To get data back from the cloud toward the ERP, we still use an email with some control codes embedded in it, delegating to a small Java application the task of retrieving mail messages, parsing their content, and performing a remote function call on the ERP (this could be developed directly within the ERP platform, but our current release doesn’t support the required APIs). Open source tools like Gradle and Maven provided convenient support for packaging the runtime artifacts and retrieving/aligning the required libraries.
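The return channel can be sketched in a few lines (here in Python rather than the Java of our actual poller): a mail whose subject carries a control code and whose body is a JSON payload, built on one side and parsed on the other. The subject convention and field names are assumptions for illustration.

```python
import json
from email import policy
from email.message import EmailMessage
from email.parser import BytesParser

def build_reply(control_code: str, payload: dict) -> bytes:
    """Cloud side: wrap a JSON payload in a mail with a control code."""
    msg = EmailMessage()
    msg["Subject"] = f"[ERP:{control_code}]"     # hypothetical subject convention
    msg.set_content(json.dumps(payload))
    return bytes(msg)

def parse_reply(raw: bytes) -> tuple:
    """ERP side: extract the control code and payload, ready for the
    remote function call (the call itself is out of scope here)."""
    msg = BytesParser(policy=policy.default).parsebytes(raw)
    code = msg["Subject"].strip("[]").split(":", 1)[1]
    data = json.loads(msg.get_content())
    return code, data

raw = build_reply("APPROVE_PR", {"request": "PR-77", "approved": True})
code, data = parse_reply(raw)
```

Because both sides speak only SMTP and JSON, neither subsystem needs to know anything about the other’s platform, which is exactly the vendor neutrality mentioned above.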
Developing everything was quite easy: both systems provide simple conversions from/to JSON, both have a development environment supporting interfaces for email sending/processing, and both have templating engines, which are critical to achieving “presentation” components that are easy to maintain.
With this approach, we have a certain degree of independence from each of the connected subsystems.
Clearly, we used a specific ERP and a specific cloud solution, but I don’t want to make reference to a single product, because this kind of approach can easily be followed using any of the main products available in each segment. All the “my product is clearly the best” marketing claims overestimate the differences among architectures. I think that a professional ERP manager or solution architect can obtain much the same result working on any platform.
A recent, provocative post on LinkedIn raised some issues about the Archimate standard.
One of my first doubts when introducing Archimate into our Architecture Capability was: “why was it created as a separate graphical language and not as a UML profile?”. This doubt is still in my mind; in my previous post about this topic, I inserted some links on this argument. The “UML profile” solution appears natural, because a lot of classifiers from UML are available – with the same semantics and graphical notation – in Archimate as well.
What are the consequences of a “separate” graphical model?
(+) dedicated modeling tools
(-) multiple skills/tools/repositories
(-) requirements for model integration
On the other side, I think that we should remember that Archimate and UML have different “meanings”:
If you look at building design, you will find an “architectural” project, showing you the shape, internal space allocation, and external appearance of the building, plus several specialized views (structural project, electrical plant, elevators, etc.); the potential buyer of the house is probably not interested in how many cables or switches are necessary to operate the elevator – and could not understand this view – but is interested in the size and number of windows of the living room.
Despite this, I believe that “Archimate as UML profile” could bear the same expressiveness, the same semantic content, and a seamless integration of architectural and technical layers.
Personally, I am evaluating the introduction of Archimate, and starting to represent (maybe with better results) our architecture with this language. In the meantime, while waiting for the similar OMG work, I am starting to develop a UML profile with the same content as Archimate. At this point, we are working on the Business Layer. The modeling tool that we use, by Sparx Systems, allows for easy integration of different models using UML, BPMN, or Archimate, so it does not force the user into a specific adoption profile. Meanwhile, the UML-profile approach is allowing us to obtain stakeholder-oriented documentation of better quality, and an easier transition from our previous graphical representation standards.
At the same time, the introduction of Archimate shows its usefulness in a more formalized and rigorous architectural design practice, relying on a number of elements that fit well with the various organizational patterns that exist in our Company.
The results – at this point – are promising. I will share them later, and introduce some of them in the second edition of my book about BPMN.
Bottom line: I believe that we need Archimate as a component of our architecture capability. Maybe representing it as a UML profile could help its adoption.
Representing organizational structures, roles, and responsibilities, and linking this model with the process model, can be a mess, especially if your architecture model has to integrate perspectives modeled according to different standards.
I present here some ideas that we are putting into practice across the Enterprise Architecture projects in my current Company.
The topic of model mapping is hot: you can find a lot of interesting thoughts in the blog of Gerben Wierda. At the same time, the OMG is working on a UML profile for Archimate.
I will start with Archimate, and the basic elements from the Business Layer: Actor and Role.
The following picture shows a suggested extension:
The Role is specialized, leading to four different stereotypes:
At the same time, the Actor element is specialized:
The diagram shows the main relationships among the elements. The “Process role” may be assigned to a human actor, or to another functional or organizational role. Anyhow, it is meaningful only in the context of execution of a single instance of one or more processes.
Why do we need these extensions? Well, mainly for ease of communication with some stakeholders. It is common wisdom that complexity in the metamodel leads to obscure resulting models (in the eyes of non-professional stakeholders). But in this case, I think that a little bit of specialization is useful:
In the following posts, I will describe how we have translated this profile in UML, and how we have linked the Business Layer with the relevant elements of BPMN models.