AI / Machine Learning
AI is the broader concept of machines acting in ways we would consider “smart”. Machine Learning is a form of AI in which machines are given access to data and learn from it themselves. Includes neural networks, deep learning, and language processing. A typical application is fraud detection.
|AI Augmented Development
|Use of AI and NLP in the development environment: debugging, testing (mutation, fuzzing), generation of code/documentation, augmented coding, recommendations for refactoring, …
Confidential Computing
Confidential computing allows an entity to perform computations on data without having access to the data itself. This can be realised in a centralised way with homomorphic encryption or a trusted execution environment (TEE), or in a decentralised way with secure multiparty computation.
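The decentralised variant can be illustrated with additive secret sharing, a building block of many secure multiparty computation protocols. This is a minimal sketch: the `share`/`reconstruct` helpers and the two-input sum scenario are invented for illustration, not taken from any MPC library.

```python
import random

MODULUS = 2**31 - 1

def share(secret: int, n: int) -> list[int]:
    """Split a secret into n additive shares; any n-1 shares reveal nothing."""
    shares = [random.randrange(MODULUS) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % MODULUS

# Two parties secret-share their inputs; each share-holder adds the shares
# it holds locally, so only the sum of the inputs is ever revealed.
a_shares = share(20, 3)
b_shares = share(22, 3)
local_sums = [x + y for x, y in zip(a_shares, b_shares)]
assert reconstruct(local_sums) == 42
```

Real protocols add secure channels and malicious-party protections; the sketch only shows why no single share-holder learns an input.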
Event Driven Architecture
An Event Driven Architecture (EDA) can offer many advantages over more traditional approaches. Events and asynchronous communication can make a system much more responsive and efficient. Moreover, the event model often better resembles the actual business data coming in.
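A minimal in-process sketch of the event model: producers publish named events and decoupled consumers react to them. The `EventBus` class and the `order.created` event are hypothetical names, not from any specific framework.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Toy publish/subscribe bus: producers and consumers never know each other."""
    def __init__(self):
        self._handlers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable) -> None:
        self._handlers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict) -> None:
        for handler in self._handlers[event_type]:
            handler(payload)

bus = EventBus()
audit_log = []
# The auditing consumer is added without touching the producer's code
bus.subscribe("order.created", lambda e: audit_log.append(e["id"]))
bus.publish("order.created", {"id": 1, "amount": 99.5})
assert audit_log == [1]
```

In a production EDA the bus would be a broker (queues, topics) and delivery asynchronous; the decoupling principle is the same.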
Natural Language Processing
Natural Language Processing (NLP), part of AI, includes techniques to distil information from unstructured textual data, with the aim of using that information inside analytics algorithms. Used for text mining, sentiment analysis, entity recognition, and Natural Language Generation (NLG).
Graph Analytics
Graph Analytics is the process of investigating relational structures (i.e., relations between entities such as people, companies, addresses, …) using network and graph theory. When the entities include people, we talk about SNA (Social Network Analytics).
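Degree centrality is one of the simplest graph-analytics measures. A small sketch on a toy network of people (all names and edges invented for illustration):

```python
from collections import defaultdict

def degree_centrality(edges):
    """Fraction of the other nodes each node is directly connected to."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    n = len(adj)
    return {node: len(neigh) / (n - 1) for node, neigh in adj.items()}

# Toy social network: who is connected to whom
edges = [("alice", "bob"), ("alice", "carol"),
         ("bob", "carol"), ("carol", "dave")]
scores = degree_centrality(edges)
assert scores["carol"] == 1.0  # carol is linked to every other entity
```

Fraud-detection use cases typically apply richer measures (betweenness, community detection) on the same kind of structure.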
|AI for Security
Non-traditional methods for improving analysis in the security technology of systems and applications (e.g., user behaviour analytics).
CD4ML
In machine learning ‘by hand’, much time is lost between training a model and putting it into production, then waiting for feedback for potential retraining. CD4ML (continuous delivery for ML) attempts to automate this process, working towards Adaptive AI.
Analytics Engineering
Analytics engineers provide clean data sets to end users, modelling data in a way that empowers end users to answer their own questions. Focus on transforming, testing, deploying, and documenting data. Tools: dbt, Snowflake, Stitch, Fivetran, Looker, Mode, Redash, columnar DBs.
APIs, to connect services within and across multiple systems, or even to third parties, are becoming prevalent and push a new business model centred around the integration of readily available data and services. They also help with loose coupling between components.
|Augmented Data Quality
Through the addition of AI, machine learning, knowledge graphs, NLP, … to data quality tools and technologies, results could become more efficient for the business.
|Back Tracking Anomalies
Method to detect causes of data quality problems in data flows between information systems and to improve them structurally. ROI is important and facilitates a win-win approach between institutions. To monitor the anomalies and transactions, an extension to the existing DBMS has to be built.
|Big Data Processing
Big data analytics solutions require an architecture that 1) executes calculations where the data is stored, 2) spreads data and calculations over several nodes, and 3) uses a data warehouse architecture that makes all types of data available to analytical tools in a transparent way.
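Principles 1) and 2) can be sketched as a toy map-reduce: each partition is processed on the "node" that stores it, and only small local results travel back to a coordinator. Partition contents and function names are illustrative.

```python
from collections import Counter

# Each "node" holds one partition of the data
partitions = [
    ["fraud", "ok", "ok"],
    ["fraud", "fraud", "ok"],
]

def map_local(partition):
    """Runs on the node storing the partition: ship the code, not the data."""
    return Counter(partition)

def reduce_global(local_counts):
    """Coordinator merges the small per-node summaries."""
    total = Counter()
    for counts in local_counts:
        total += counts
    return dict(total)

result = reduce_global(map_local(p) for p in partitions)
assert result == {"fraud": 3, "ok": 3}
```

Frameworks such as Hadoop or Spark industrialise exactly this pattern across real clusters.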
Causal AI
Causal AI techniques make it possible to understand the causes of a prediction outcome. They encompass methods such as causal Bayesian networks, causal rules, and combinations of symbolic and neural AI.
Applications composed of business-oriented building blocks, where these modular reusable blocks are independent of one another and can be configured by Business and IT into a solution. The main advantage is supporting the agility of the business in the face of change while maintaining resilience.
Crypto-Agility
Crypto-agility allows an information security system to switch to alternative cryptographic primitives and algorithms without making significant changes to the system’s infrastructure. Crypto-agility facilitates system upgrades and evolution.
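One common way to obtain crypto-agility is to route every cryptographic call through a named-algorithm registry, so that retiring a weakened primitive becomes a configuration change rather than a code hunt. A minimal sketch using hash functions (the registry layout and `fingerprint` helper are illustrative):

```python
import hashlib

# Call sites never hard-code an algorithm; they go through the registry
ALGORITHMS = {
    "sha256": hashlib.sha256,
    "sha3_256": hashlib.sha3_256,
}
CURRENT = "sha256"  # single point of change when migrating

def fingerprint(data: bytes, algorithm=None) -> str:
    return ALGORITHMS[algorithm or CURRENT](data).hexdigest()

assert fingerprint(b"abc") == hashlib.sha256(b"abc").hexdigest()
assert fingerprint(b"abc", "sha3_256") != fingerprint(b"abc")
```

The same indirection applies to ciphers and signatures, where key sizes and formats must also be versioned.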
Methods and tools to access databases with heterogeneous models, facilitating user access through a virtual logical view.
Knowledge Graphs
Knowledge Graphs relate entities in a meaningful graph structure to facilitate various processes from information retrieval to business analytics. Knowledge graphs typically integrate data from heterogeneous sources such as databases, documents, and even human input. Part of AI.
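A knowledge graph can be reduced to subject–predicate–object triples. A minimal sketch of storage and wildcard querying (the facts and the `query` helper are invented for illustration):

```python
# Tiny in-memory triple store: (subject, predicate, object) facts
triples = {
    ("acme_corp", "registered_at", "main_street_1"),
    ("alice", "director_of", "acme_corp"),
    ("bob", "director_of", "acme_corp"),
    ("alice", "lives_at", "main_street_1"),
}

def query(s=None, p=None, o=None):
    """Pattern matching over the graph, with None as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

directors = [s for s, _, _ in query(p="director_of", o="acme_corp")]
assert sorted(directors) == ["alice", "bob"]
```

Production systems use RDF stores or graph databases and query languages such as SPARQL, but the triple model is the same.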
Microservices
Independently maintainable and deployable services, kept very small (hence ‘micro-’), make an application, or even large groups of related systems, much more flexibly scalable. They provide functional agility, allowing a system to rapidly support new business opportunities.
Platform Engineering
The discipline of designing and building toolchains and workflows that enable self-service capabilities, by providing an integrated product most often referred to as an “Internal Developer Platform” covering the operational necessities of the entire lifecycle of an application.
The flow of (incoming) data, and not an application’s (or CPU’s) regular control flow, governs its architecture. This is a new paradigm, sometimes even driven by new hardware, and opposes the traditional control-driven way of working. Also known as Dataflow Architecture and related to EDA.
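The paradigm can be hinted at with a generator pipeline: each stage reacts to data flowing in from the previous one, instead of being called by a central controller. The stage names are illustrative.

```python
# Data-flow style: stages are wired together and data drives execution
def source(values):
    """Emit raw values as they arrive."""
    yield from values

def enrich(stream):
    """React to each incoming value, emit an enriched record."""
    for v in stream:
        yield {"value": v, "doubled": v * 2}

def sink(stream):
    """Terminal stage: collect whatever flows in."""
    return [record["doubled"] for record in stream]

result = sink(enrich(source([1, 2, 3])))
assert result == [2, 4, 6]
```

Stream-processing engines generalise this wiring across machines and make the stages run concurrently.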
|Remote Identity Verification
Remote identity verification comprises the processes and tools to remotely verify someone’s identity, without the need for the person to physically present themselves to an authority.
Superapps
Some mobile apps, like WeChat and AliPay, have become entire ecosystems of pluggable mini-apps. Users can greatly customise their experience within the superapp, and integration between mini-apps is much tighter than between normal smartphone apps. Popular in China now, but they may be coming here soon.
Synthetic Data
Synthetic Data is concerned with creating a fictitious dataset that mimics a real one in format, looks and statistical properties. Can be used to further minimise the need to share sensitive or protected data.
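A very crude sketch of the idea: fit a simple distribution to the real data and sample a fictitious set from it. Real generators preserve far more structure (correlations, categorical fields); the salary figures and `synthesize` helper are invented.

```python
import random
import statistics

def synthesize(real, n, seed=0):
    """Draw a fictitious sample from a normal fit of the real data."""
    mu = statistics.mean(real)
    sigma = statistics.stdev(real)
    rng = random.Random(seed)
    return [rng.gauss(mu, sigma) for _ in range(n)]

real_salaries = [2800.0, 3100.0, 2950.0, 3300.0, 2700.0]
fake_salaries = synthesize(real_salaries, 1000)
# No synthetic record corresponds to a real person, yet the mean and
# spread of the fake set track those of the real one.
```

For sharing purposes one would additionally check that no synthetic record is too close to a real one (privacy leakage).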
|Zero Trust Architecture
|The main concept behind zero trust is “never trust, always verify,” which means that devices should not be trusted by default, even if they are connected to a managed corporate network such as the corporate LAN and even if they were previously verified. Also known as “perimeterless security.”
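The principle can be caricatured in a few lines: every request is evaluated on identity, device and context, and network location buys nothing. The attribute names are illustrative, not taken from any real policy engine.

```python
def authorize(request: dict) -> bool:
    """Zero trust: verify on every request; no default trust for the LAN."""
    return (
        request.get("token_valid", False)
        and request.get("device_compliant", False)
        and request.get("mfa_passed", False)
    )

# A request from inside the corporate LAN with no valid token is still denied
assert not authorize({"network": "corporate_lan", "token_valid": False})
# Only a fully verified request passes, wherever it comes from
assert authorize({"token_valid": True, "device_compliant": True,
                  "mfa_passed": True})
```

Real deployments delegate these checks to identity providers and device-posture services, but evaluate them on each access just like this.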
|Augmented Data Science
|Augmented data science and machine learning (augmented DSML) uses artificial intelligence to help automate and assist key aspects of a DSML process. These aspects include data access and preparation, feature engineering, as well as model operationalization, model tuning and management.
|In Master Data Management, collaborative and organised management of anomalies stemming from distributed authentic sources, by their official owners.
|Compliance Automation & Rules as Code
|The (semi-)automation of compliance and compliance verification processes which currently rely on manual input. This requires one to formalise, to the extent possible, regulation and policies that trigger actions. Tightly coupled to the LegalTech concept of Rules as Code.
Monitoring and management of performance and “system incidents”, combined with real-time monitoring of data errors and lineage, to automatically resolve the cause (bugs and formal causes only) in the software components of the various interlinked information systems.
Approach to protect sensitive data uniquely and centrally, regardless of format or location (using, e.g., data anonymisation or tokenisation technologies in conjunction with centralised policies and governance).
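Tokenisation, one of the techniques mentioned, can be sketched as a vault that swaps a sensitive value for an opaque token usable throughout the rest of the system. The `TokenVault` class is a toy; real systems add policy-gated detokenisation, auditing and key management.

```python
import secrets

class TokenVault:
    """Central store mapping opaque tokens to the sensitive originals."""
    def __init__(self):
        self._vault = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # In a real deployment this call would be gated by central policy
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("sensitive-id-123")  # hypothetical sensitive value
assert token != "sensitive-id-123"
assert vault.detokenize(token) == "sensitive-id-123"
```

Downstream systems store and pass around only the token, so a breach there reveals nothing about the protected value.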
|Cyber Immune System
A cyber immune system combines processes and technologies to increase the robustness of computer systems against any kind of failure. It builds on technologies such as AI-augmented testing and auto-remediation, and on processes such as software supply chain security and reliability engineering.
Information processing and content collection and delivery are placed closer to the endpoints to avoid high WAN costs and the unacceptable latency of the cloud. Edge computing also becomes more relevant in the context of AI solutions (cf. TinyML).
A general way to evolve systems away from too restrictive ACID principles. Using this, and pushing it through at the business level, is the only way to keep systems evolving towards a more distributed, scalable, flexible, and maintainable lifecycle.
Best practices coming from DevOps, applied to Operations. This means, for instance, that all configuration is specified in files that can be maintained under version control and that are machine-readable by tools, to automate as many things as possible.
Enhancement of human capabilities using technology and science. Can be very futuristic (e.g. brain implants), but intelligent glasses could be a realistic physical augmentation. Cognitive augmentation (improving a human’s ability to think and make better decisions) will be made possible by AI.
Living Documentation
Living documentation actively co-evolves with code, making it constantly up-to-date without requiring separate maintenance. It lives with the code and can be automatically used by tools to generate publishable specifications. An example can be found in some forms of annotations.
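A tiny illustration of the idea: the publishable documentation is extracted from the code itself, so it cannot drift from what the code does. The function, docstring and `generate_docs` helper are invented for illustration.

```python
import inspect

def apply_discount(price: float, percent: float) -> float:
    """Reduce price by the given percentage (0-100)."""
    return price * (1 - percent / 100)

def generate_docs(funcs):
    """Build a publishable spec straight from the code's own docstrings."""
    return {f.__name__: inspect.getdoc(f) for f in funcs}

docs = generate_docs([apply_discount])
assert "Reduce price" in docs["apply_discount"]
```

Annotation-driven API specs (e.g. generating an OpenAPI document from route declarations) apply the same principle at larger scale.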
|Set of techniques, tools and platforms to develop web based and platform-specific mobile applications.
|Multimedia Data Protection
Protection of multimedia data has gained importance with social media and remote working, but also with the development of powerful AI models. Detecting falsification is critical: for instance, one should be able to detect forgery of images (e.g., faces used for biometrics).
By designing systems to be observable from the start, it becomes easier to detect and fix unexpected problems as early in the development life cycle as possible, making it cheaper to deal with them.
|Privacy by Design
|Privacy by design calls for privacy to be taken into account throughout the whole engineering process. The European GDPR regulation incorporates privacy by design. An example of an existing methodology is LINDDUN.
Process Mining
Includes automated process discovery (extracting process models from an information system’s event log), and also offers possibilities to monitor, check and improve processes. Often used in preparation of RPA and other business process initiatives (in the context of digital transformation).
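Automated process discovery can be hinted at with the simplest possible miner: counting which activity directly follows which in an event log. The traces and activity names are invented for illustration.

```python
from collections import Counter

# Event log: one trace of activities per case, as exported from an IS
log = [
    ["receive", "check", "approve", "pay"],
    ["receive", "check", "reject"],
    ["receive", "check", "approve", "pay"],
]

def directly_follows(traces):
    """Count how often activity a is immediately followed by activity b."""
    df = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

model = directly_follows(log)
# Every case starts receive -> check; the counts after "check" reveal
# where the process branches (approve vs. reject).
assert model[("receive", "check")] == 3
```

Real miners (e.g. the alpha algorithm or inductive mining) build full process models from exactly this kind of relation.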
A new way to integrate that minimises manual work, based on having applications discover services, extracting metadata from various sources, automating the definition of processes, and automatically mapping dependencies.
Methodology and enabling tools to combine data visualisation and analytics, allowing data to be rapidly explored, analysed, and forecast. This helps with modelling in advanced analytics and with building modern, interactive, self-service BI applications.
|Voice of the Citizen Applications
Contains a number of approaches to capture and analyse explicit or implicit feedback from users, in order to improve the systems and remove friction.
Also ‘Hexagonal Architecture’: a set of architectural principles making the domain model code central to everything and dependent on no other code or framework. Other aspects of the program code can depend on the domain code. Gained a lot of popularity in the community recently.
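A minimal sketch of the principle: the domain defines a port (an interface) and knows nothing about infrastructure; adapters are plugged in from the outside. All names (`SubsidyRepository`, `total_subsidy`, the in-memory adapter) are illustrative.

```python
from typing import Protocol

# Port: declared by the domain, depends on nothing else
class SubsidyRepository(Protocol):
    def find_amount(self, citizen_id: str) -> float: ...

# Domain logic: central, framework-free, knows only the port
def total_subsidy(citizen_ids, repo: SubsidyRepository) -> float:
    return sum(repo.find_amount(cid) for cid in citizen_ids)

# Adapter: an infrastructure detail that depends on the domain's port,
# never the other way round (a DB-backed adapter would slot in the same way)
class InMemoryRepository:
    def __init__(self, data):
        self._data = data

    def find_amount(self, citizen_id: str) -> float:
        return self._data[citizen_id]

repo = InMemoryRepository({"a": 100.0, "b": 50.0})
assert total_subsidy(["a", "b"], repo) == 150.0
```

Because the dependency arrow points inwards, the domain can be tested and evolved without any database or framework in sight.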
|Web3 – Citizen Control
|Web3 is an idea for a new iteration of the World Wide Web which incorporates concepts such as decentralisation, blockchain technologies, and token-based economics. It promises to give back control to citizens over their assets, as well as over their identity.