Winning in digital means working differently

It’s All About the Customer

Digital transformation is pushing IT to do more. You’re tasked with mastering new tech, collaborating with the business on digital strategies, and recommending strategic investments for the future. How can you do more with confidence? Go with the leader in enterprise architecture management, IT portfolio management and IT planning tools.

We offer a unique EA practice that assesses the fitness of your application portfolio against business and technical requirements, utilizing best practices such as Gartner's TIME model.

Industry-analyst-recognized Alfabet helps you invest in IT wisely by managing your current IT portfolio and collaboratively planning for the future.

Palmira helps you build your Enterprise Architecture, including the Business Layer (Strategy Management, Business Processes, Performance Management, Service Management), Application Layer, Infrastructure Layer (managing your IT portfolio), and Data Layer. On top of your EA, we provide a comprehensive governance model covering internal audit management and risk management.

“81% of companies expect to be competing mostly or completely on the basis of Customer Experience”

Gartner Customer Experience Survey, 2018

The BiZZdesign Advantage

  • The Customer-Centric Adaptive Enterprise
  • Digital Twin: a digital model of a physical asset
    o Real-time operational data from IoT sensors
    o Enables advanced analytics, e.g. predictive maintenance, resource optimisation, and flow control
  • Digital Twin of the Enterprise: a digital model of the enterprise architecture
    o Real-time data overlay from operational systems
    o Enables advanced analytics, including dependency and impact analysis
    o Brings architecture together with operations, aligning resources with strategy and coordinating efforts across silos
    o Evolves EA into a “must have” capability for digital business

HoriZZon Platform Overview


The Cloudera foundation is built upon the Apache Hadoop framework and employs the largest group of committers under one roof. Cloudera enables organizations to capture, store, analyze and act on any data at massive speed and scale in a single data solution using Hadoop platforms.


Cloudera is hardware-agnostic, and our solutions can be optimized for both cloud and on-premises environments. As a result, Cloudera has a vast partner ecosystem, and we pride ourselves on solutions that are highly compatible with our customers’ existing environments and service providers. This allows our solution to be molded to your environment for a custom experience, rather than wasting time and resources introducing solutions that are incompatible with the pre-existing hardware, environment, or service providers, depleting the budget before the proposed solution is even installed.

Your goal to modernize legacy systems and better harness your data is a mission we at Cloudera share. We strive to bring a comprehensive solution set for data analytics to data wherever the enterprise needs to work, from the Edge to AI.

By implementing an open source data platform supported by Cloudera on your own infrastructure, in the cloud or a hybrid of both, we expect you can achieve the following core benefits as we enable your Data Lake:

  1. New Efficiencies for data architecture through a significantly lower cost storage platform by leveraging the industry’s only secure enterprise-ready open source Hadoop distribution. A modern data architecture will allow you to integrate, store and process all enterprise data regardless of source, format, and type at a fraction of the cost of proprietary solutions.
  2. Capture Data in Motion in a secure, traceable way to tap into the potential of streaming data analytics, data routing, and seamless data ingestion from Dubai Municipality-owned or public data sources.
  3. New Opportunities, Innovation & Insights by providing data scientists, business analysts, and data developers with the ability to easily access and query all enterprise data within one environment from batch to real time using the tools they are most familiar with.

Building a Futuristic Platform (AI/ML, Big Data, and BI)

Cloudera offers an Enterprise-Supported Full Data Lifecycle

From autonomous vehicles, to surgical robots to churn prevention and fraud detection, enterprises rely on data to uncover new insights and power world-changing solutions. It all starts with a data platform that enables you to say “yes”.

  • Yes, to the analytics your people want to use. 
  • Yes, to operating on any cloud your business requires. 
  • Yes, to the future with a cloud-native platform that flexes to meet your needs today and tomorrow. And we have delivered.

Cloudera Data Platform is the industry’s first enterprise data cloud:

  • Multi-function analytics on a unified platform that eliminate silos and speed the discovery of data-driven insights.
  • A shared data experience that applies consistent security, governance, and metadata.
  • True hybrid capability with support for public cloud, multi-cloud, private cloud, and on-premises deployments.

Cloudera Shared Data Experience (SDX)

SDX is the security and governance fabric that binds the enterprise data cloud. SDX enables data and metadata security and governance policies to be set once and automatically enforced across data analytics in hybrid and multi-clouds. Unlike standalone analytics software solutions or cloud services, Cloudera Data Platform with SDX delivers powerful enterprise-wide controls over data and metadata, anywhere, for ultimate infrastructure and business flexibility.

Cloudera Data Platform (CDP)

CDP is an easy, fast, and secure enterprise analytics and management platform with the following capabilities:

  • Enables ingesting, managing, and delivering any analytics workload from Edge to AI.
  • Provides enterprise-grade security and governance.
  • Provides self-service access to integrated, multi-function analytics on centrally managed and secured business data.
  • Provides a consistent experience across public cloud, multi-cloud, and private cloud deployments.

CDP powers data-driven decision making by easily, quickly, and safely connecting and securing the entire data lifecycle. For this, data moves through a lifecycle in five distinct phases.


CDP gives you complete visibility into all your data with no blind spots. The CDP control plane allows you to manage the data, infrastructure, analytics, and analytic workloads across hybrid and multi-cloud environments all with Cloudera shared experience or SDX providing consistent security and governance across the entire data lifecycle. You can manage and secure the data lifecycle in any cloud and data center with CDP.

CDP enables you to:

  • Automatically spin up workloads when needed and suspend their operation when complete thereby controlling the cloud costs.
  • Optimize workloads based on analytics and machine learning.
  • View data lineage across any cloud and transient clusters
  • Use a single pane of glass across hybrid and multi-clouds.
  • Scale to petabytes of data and 1,000s of diverse users
  • Centrally control customer and operational data across multi-cloud and hybrid environment

CDP Solution:

Cloudera CDP provides a unified platform to cost-effectively collect, store and manage unlimited volumes of any structured, semi-structured and unstructured data.

Cloudera’s Enterprise Data Hub (EDH) consists of

  • CDH (Cloudera’s Distribution including Apache Hadoop)
  • Cloudera’s Enterprise Management, Governance and Security layer.
  • Cloudera’s DataFlow (CDF)

The above diagram explains how we deliver the risk management operating model, ensuring the risk profiling process can move from the federal level to the local level and vice versa without disrupting trade in the local customs department.

Data Lake as the Single Point of Truth

CDP is 100% Apache-licensed open source and offers unified batch processing, interactive SQL, interactive search, and role-based access controls. More enterprises have downloaded CDP than all other such distributions combined. CDP includes the core elements of Apache Hadoop plus several additional key open-source projects that, when coupled with customer support, management, and governance through a Cloudera Enterprise subscription, can deliver an enterprise data hub.

CDP is:

  • Flexible – Store any type of data and process it with an array of computation frameworks, including batch processing, interactive SQL, free-text search, machine learning, and statistical computation.
  • Integrated – Get up and running quickly on a complete, packaged, Hadoop platform.
  • Secure – Process and control sensitive data and facilitate multi-tenancy.
  • Scalable & Extensible – Enable a broad range of applications and scale them with your business.
  • Highly Available – Run mission-critical workloads with confidence.
  • Compatible – Extend and leverage existing IT investments.

A unified control plane manages infrastructure, data, and analytic workloads across a hybrid environment that already spans data and compute in on-premises HDFS (and public clouds if needed). It also allows FCA to implement future use cases flexibly and easily, making the delivery of use cases that enable business services as smooth as possible, in a matter of minutes.

Consistent data security, governance, and control safeguard data privacy and regulatory compliance, and prevent cybersecurity threats across environments.

CDP is the industry’s first enterprise data platform. CDP delivers powerful self-service analytics across hybrid, on-premises, and multi-cloud environments, along with the sophisticated and granular security and governance policies that data leaders demand.

Delivered as a private cloud service, CDP includes Machine Learning services as well as a Data Hub service for building custom business applications powered by our new Cloudera Runtime open-source distribution. These services all run on the Red Hat OpenShift containerization and Kubernetes platform, which provides elastic capacity to run hundreds of workloads, each enabled for automatic workload performance adaptation and in-place auto-scaling.


We are pleased to submit the following information to the FCA. Our solution foundation is built upon the Apache Hadoop framework and employs the largest group of committers under one roof. We enable organizations like FCA to capture, store, analyze, and act on any data at massive speed and scale in a single data solution using Hadoop platforms. We pride ourselves on being hardware-agnostic, and our solutions can be optimized for both cloud and on-premises environments. As a result, we have a vast partner ecosystem, and our solutions are highly compatible with our customers’ existing environments and service providers. This allows our solution to be molded to your environment for a custom experience, rather than FCA wasting time and resources introducing solutions that are incompatible with the pre-existing hardware, environment, or service providers, depleting the budget before the proposed solution is even installed. Your goal to modernize legacy systems and better harness your data is a mission we at Palmira share. We strive to bring a comprehensive solution set for data analytics to data wherever the enterprise needs to work, from the Edge to AI.

By implementing an open-source data platform supported by Cloudera on your own infrastructure, in the cloud or a hybrid of both, we expect FCA can achieve the following core benefits as we enable your Data Lake:

  1. New Efficiencies for data architecture through a significantly lower cost storage platform by leveraging the industry’s only secure enterprise-ready open-source Hadoop distribution. A modern data architecture will allow you to integrate, store and process all enterprise data regardless of source, format, and type at a fraction of the cost of proprietary solutions.
  2. Capture Data in Motion in a secure, traceable way to tap into the potential of streaming data analytics, data routing, and seamless data ingestion from [insert Client]-owned or public data sources.
  3. New Opportunities, Innovation & Insights by providing data scientists, business analysts, and data developers with the ability to easily access and query all enterprise data within one environment.

After thoroughly studying FCA’s requirements, we recommend utilizing Cloudera for Big data and analytics and Tableau for BI requirements.

CDSW (Cloudera Data Science Workbench):

CDSW, the unique web-based collaborative development tool for data scientists, is proposed as a tightly integrated component of the Cloudera platform. CDSW provides a secure environment for data scientists that integrates with the Cloudera platform and gives them access to the full data available on the Hadoop cluster. To avoid reinventing the wheel, all security policies attached to Hadoop users can be inherited with a few clicks in CDSW environments, providing a seamless experience for data scientists. CDSW offers the flexibility to develop in multiple languages (Python, Scala, and R), with different versions supported for each; a single user can have multiple projects across different languages, with permissions and privileges assigned to user lists inherited from the LDAP directory already integrated with Hadoop.


Cloudera Data Science Workbench is a web application that allows data scientists to use their favorite open-source libraries and languages — including R, Python, and Scala — directly in secure environments, accelerating analytics projects from exploration to production.

Built using container technology, Cloudera Data Science Workbench offers data science teams per project isolation and reproducibility, in addition to easier collaboration. It supports full authentication and access controls against data in the cluster, including complete, zero-effort Kerberos integration which means full, tight and seamless integration of Cloudera EDH users and security configuration. Add it to an existing cluster, and it just works. With Cloudera Data Science Workbench, data scientists can:

  • Use R, Python, or Scala on the cluster from a web browser, with no desktop footprint.
  • Install any library or framework within isolated project environments.
  • Directly access data in secure clusters with Spark and Impala.
  • Share insights with their team for reproducible, collaborative research.

Automate and monitor data pipelines using built-in job scheduling.
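The job-scheduling idea above — data pipelines as chained, dependency-ordered jobs — can be sketched in plain Python. This is a generic illustration of the concept, not the CDSW jobs API; the job names and dependency map are invented:

```python
def run_pipeline(jobs, dependencies):
    """Run jobs in dependency order (a simple topological walk), the idea
    behind chaining scheduled pipeline jobs. Generic sketch, not CDSW's API."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for dep in dependencies.get(name, []):
            run(dep)          # run prerequisites first
        jobs[name]()          # execute the job body
        done.add(name)
        order.append(name)

    for name in jobs:
        run(name)
    return order

log = []
jobs = {
    "train":  lambda: log.append("train"),   # hypothetical job bodies
    "ingest": lambda: log.append("ingest"),
    "clean":  lambda: log.append("clean"),
}
deps = {"train": ["clean"], "clean": ["ingest"]}
order = run_pipeline(jobs, deps)
```

Even though "train" is listed first, its dependencies force ingest → clean → train execution order.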


Cloudera Data Flow:

Cloudera DataFlow (CDF) is a scalable, real-time streaming data platform that collects, curates, and analyzes data so customers gain key insights for immediately actionable intelligence. It meets the challenges faced with data-in-motion, such as real-time stream processing, data provenance, and data ingestion from IoT devices and other streaming sources. Built on 100% open-source technology, CDF helps you deliver a better customer experience, boost your operational efficiency and stay ahead of the competition across all your strategic digital initiatives. CDF is very similar to HDF, and the foundations are the same. CDF is available on EDH while HDF is only available for HDP.

Recently, the rhythm and nature of organizations’ needs have taken a different angle, with special requirements that go beyond dealing with traditional and legacy systems. Use cases are becoming more complex, systems are siloed, and data sources are becoming unpredictable in nature and format. Batch data streaming remains necessary for acquiring data from transactional and operational systems, but now more than ever organizations also need streaming and real-time data ingestion. Cloudera has been a pioneer in the streaming area, adopting and integrating the latest and most sophisticated streaming technologies with the Cloudera Data Platform.

Cloudera Data Flow (CDF) is a complete portfolio providing best-of-breed technologies in the data-in-motion, streaming, IoT, and data ingestion fields. Based on FCA requirements, we are proposing data messaging and streaming technologies as part of this proposal. However, for any future use cases with IoT requirements, it is easy to procure and integrate any component of the CDF portfolio to achieve the required functionality.

Cloudera Flow Management (Apache NiFi)

Cloudera Flow Management (CFM) is a no-code data ingestion, movement and management solution powered by Apache NiFi. With NiFi’s intuitive graphical interface and 300+ processors, CFM delivers highly scalable data movement, transformation and management capabilities to the enterprise.

Apache NiFi is meant for large-scale, high-velocity enterprise data ingestion use cases. Primarily intended for real-time streaming sources such as clickstreams, social streams, and log data, Apache NiFi can handle all types of data across any type of data source. NiFi Registry, which augments NiFi, enables DevOps teams with versioning, deployment, and development of flow applications.

Only the components outlined in red are proposed per CLIENT NAME requirements.

Apache NiFi has an intuitive user interface for designing data flow orchestrations for acquiring, processing, and routing data from any source to any target. This is accomplished with a no-code approach: pre-built processors are dragged and dropped onto the canvas and connected up.

Apache NiFi provides the following unique features and capabilities:

  • Intuitive visual design tool
  • Flow templates
  • Guaranteed delivery
  • Prioritized queuing
  • Flow-specific QoS (latency vs. throughput, loss tolerance, etc.)
  • Data provenance
  • Comprehensive security (authentication and authorization)
  • Extensible architecture
  • Site-to-site communication protocol
  • Flexible scaling model
  • Parameterization for seamless deployment of flows
  • Stateless NiFi execution mode for extremely high performance
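Two of the capabilities listed above — prioritized queuing and flow-specific QoS via back-pressure — can be illustrated with a small sketch. This is a toy model of the queuing concepts only, not NiFi internals; the threshold and flowfile names are invented:

```python
import heapq

class FlowQueue:
    """Toy prioritized connection queue: higher-priority flowfiles are
    processed first, and back-pressure rejects enqueues past a threshold.
    A sketch of prioritized queuing + back-pressure, not NiFi's implementation."""

    def __init__(self, backpressure_threshold):
        self.threshold = backpressure_threshold
        self._heap = []
        self._seq = 0                      # FIFO tie-break within a priority

    def offer(self, flowfile, priority):
        if len(self._heap) >= self.threshold:
            return False                   # back-pressure: upstream must wait
        heapq.heappush(self._heap, (-priority, self._seq, flowfile))
        self._seq += 1
        return True

    def poll(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

q = FlowQueue(backpressure_threshold=3)
q.offer("log-batch", priority=1)
q.offer("alert-event", priority=9)
q.offer("metrics", priority=5)
accepted = q.offer("overflow", priority=9)   # rejected: queue is full
```

Polling drains the queue highest-priority first: alert-event, then metrics, then log-batch.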

Cloudera Stream Processing (Apache Kafka) is a streaming platform that provides the following capabilities:

  • High throughput and low latency: Kafka supports hundreds of thousands of messages per second, with latencies as low as a few milliseconds.
  • Scalability: A Kafka cluster can be elastically and transparently expanded without downtime.
  • Durability and reliability: Messages are persisted on disk and replicated within the cluster to prevent data loss.
  • Fault tolerance: The platform is immune to machine failure in the Kafka cluster.
  • High concurrency: Ability to simultaneously handle thousands of diverse clients writing to and reading from Kafka.
  • Publish and subscribe to streams of records, similar to a message queue or enterprise messaging system.
  • Process streams of records as they occur.

Through its integration with CDF and hence CDP, you can build complete workloads within a single platform. Only Cloudera provides simple deployment and robust troubleshooting and monitoring of Kafka, as well as shared compliance-ready security, governance, lineage, and control in one simple application across multiple on-premises, hybrid, private, public, or multi-cloud environments.


Kafka Streams

Kafka Streams is the built-in stream processing library of the Apache Kafka project and provides real-time stream processing and analytics with high throughput and very low latency. It is a good fit if you are developing solely within a Kafka-to-Kafka pipeline, you don’t need or want another cluster for stream processing and analytics in the future, and operational and resilience requirements are simple or handled elsewhere. Kafka Streams enables you to perform common stream processing functions like filtering, joins, aggregations, and enrichments on the data stream. Good use cases include building lightweight microservices, straightforward ETL jobs, and simple stream analytics apps. It is mainly used for building real-time streaming data pipelines and real-time streaming applications that transform or react to streams of data.
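The common stream processing functions mentioned above (filtering, enrichment, aggregation) can be sketched in plain Python. This illustrates the operations only, not the Kafka Streams API; the record fields and bucket rule are invented:

```python
from collections import Counter

def process(stream):
    """Filter, enrich, and aggregate a record stream, Kafka Streams-style
    (filter -> map -> count-by-key). A plain-Python sketch of the operations,
    not the Kafka Streams API."""
    filtered = (r for r in stream if r["amount"] > 0)            # filter
    enriched = ({**r, "bucket": "large" if r["amount"] >= 100 else "small"}
                for r in filtered)                                # enrich/map
    counts = Counter(r["bucket"] for r in enriched)               # aggregate
    return dict(counts)

totals = process([
    {"amount": 250}, {"amount": 5}, {"amount": -1}, {"amount": 100},
])
```

The negative-amount record is dropped by the filter, and the remaining three are counted per bucket.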

Kafka Cruise Control

Kafka Cruise Control enables you to manage and load-balance large Kafka installations. It is the solution for platform teams that need first-class management services to address hard problems such as frequent hardware/virtual machine failures, cluster expansion and reduction, and load skew among brokers. It solves these challenges by balancing the cluster intelligently and with automated anomaly detection and remediation.

While it automatically balances partitions based on user-defined goals, Kafka Cruise Control also detects and actively addresses anomalies. For example, if there is a broker failure, Kafka Cruise Control will fix the cluster by removing the failed broker. In the case of disk failure, all offline replicas are moved to healthy brokers. Kafka Cruise Control is a very important component of CDF, since it provides the foundation for first-class Kafka cloud workloads.
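The load-skew balancing idea can be sketched as a greedy assignment of partitions to the least-loaded broker. This is a toy illustration of the concept, not Cruise Control's goal-based optimizer; the partition loads and broker names are invented:

```python
def rebalance(partitions, brokers):
    """Greedily assign partitions (by descending load) to the least-loaded
    broker, the core idea behind fixing load skew. A toy sketch, not
    Cruise Control's goal-based optimizer."""
    load = {b: 0 for b in brokers}
    assignment = {}
    for part, part_load in sorted(partitions.items(), key=lambda kv: -kv[1]):
        target = min(load, key=load.get)   # least-loaded broker so far
        assignment[part] = target
        load[target] += part_load
    return assignment, load

parts = {"t0-p0": 50, "t0-p1": 30, "t1-p0": 30, "t1-p1": 10}
assignment, load = rebalance(parts, brokers=["b1", "b2"])
```

With these loads the greedy pass ends with both brokers carrying 60 units each, eliminating the skew.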

Schema Registry

Schema Registry is an important component of the Cloudera Kafka ecosystem because it enables your teams to safely mitigate interruptions that occur due to schema mismatches. It manages, shares, and supports the evolution of all producer and consumer schemas across the Kafka landscape. You can also avoid having to attach a schema to every piece of data.

As part of CDF’s streams messaging capabilities, Schema Registry provides a shared repository of schemas that allows applications to flexibly interact with each other across the Kafka landscape by using the same schemas from end-to-end. This is particularly useful for managing data flows with schema-based routing. For example, parsing a syslog event to extract the event type, and then based on that type, route it to a downstream Kafka topic.
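The schema-based routing example above — route a parsed syslog event to a downstream topic by its type — might look like this in outline. The field names, topics, and routing table are illustrative assumptions:

```python
def route_by_type(event, routes, default_topic="events.unclassified"):
    """Route a parsed event to a downstream Kafka topic by its type field.
    A sketch of schema-based routing; field and topic names are made up."""
    return routes.get(event.get("event_type"), default_topic)

# Hypothetical routing table: event type -> downstream topic
ROUTES = {
    "auth_failure": "syslog.security",
    "disk_full":    "syslog.capacity",
}
topic = route_by_type({"event_type": "auth_failure", "host": "web-1"}, ROUTES)
```

Events whose type has no route fall through to a default topic rather than being dropped.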

The screenshot below shows how you would use the Schema Registry UI to create schema groups, schema metadata, and add schema versions.


Figure: Cloudera Schema Registry GUI

Apache Kafka is formed of multiple components that together make it the highest-throughput messaging platform:

A topic is a category or feed name to which records are published. Topics in Kafka are always multi-subscriber; that is, a topic can have zero, one, or many consumers that subscribe to the data written to it. For each topic, the Kafka cluster maintains a partitioned log that looks like this:


Each partition is an ordered, immutable sequence of records that is continually appended to—a structured commit log. The records in the partitions are each assigned a sequential id number called the offset that uniquely identifies each record within the partition.

The Kafka cluster durably persists all published records—whether or not they have been consumed—using a configurable retention period. For example, if the retention policy is set to two days, then for the two days after a record is published, it is available for consumption, after which it will be discarded to free up space. Kafka’s performance is effectively constant with respect to data size so storing data for a long time is not a problem.
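The partition semantics described above — an append-only log with ever-growing offsets and time-based retention — can be modeled in a few lines. This is a toy sketch of the semantics, not Kafka's implementation:

```python
import time
from collections import deque

class PartitionLog:
    """Toy model of a Kafka partition: an append-only log with sequential
    offsets and time-based retention. Illustrative only."""

    def __init__(self, retention_seconds):
        self.retention = retention_seconds
        self.records = deque()          # (offset, timestamp, value)
        self.next_offset = 0            # offset assigned to the next append

    def append(self, value, now=None):
        now = time.time() if now is None else now
        offset = self.next_offset
        self.records.append((offset, now, value))
        self.next_offset += 1
        return offset

    def expire(self, now=None):
        """Discard records older than the retention period; offsets keep growing."""
        now = time.time() if now is None else now
        while self.records and now - self.records[0][1] > self.retention:
            self.records.popleft()

    def read_from(self, offset):
        """A consumer reads sequentially from its own offset."""
        return [v for (o, _, v) in self.records if o >= offset]

log = PartitionLog(retention_seconds=2 * 24 * 3600)  # two-day retention
log.append("a", now=0)
log.append("b", now=1)
log.expire(now=3 * 24 * 3600)        # three days later: both records expired
log.append("c", now=3 * 24 * 3600)   # offsets are never reused
```

Note that expiring old records frees space but never rewinds `next_offset`: each record's offset stays unique for the life of the partition.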

Producers publish data to the topics of their choice. The producer is responsible for choosing which record to assign to which partition within the topic. This can be done in a round-robin fashion simply to balance load, or it can be done according to some semantic partition function (say based on some key in the record). 
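The two partitioning strategies just described — round-robin for load balancing versus a semantic (key-based) partition function — can be sketched as follows. This illustrates the idea, not Kafka's exact default-partitioner algorithm; the partition count is arbitrary:

```python
import zlib
from itertools import count

NUM_PARTITIONS = 6
_round_robin = count()

def choose_partition(key=None):
    """Keyed records hash to a stable partition (preserving per-key order);
    keyless records are spread round-robin to balance load. A sketch of the
    idea, not Kafka's exact algorithm."""
    if key is not None:
        return zlib.crc32(key.encode()) % NUM_PARTITIONS
    return next(_round_robin) % NUM_PARTITIONS

# The same key always lands on the same partition:
p1 = choose_partition("order-42")
p2 = choose_partition("order-42")
```

Stable key-hashing is what guarantees per-key ordering, since all records for a key land in the same ordered partition log.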

Consumers label themselves with a consumer group name, and each record published to a topic is delivered to one consumer instance within each subscribing consumer group. Consumer instances can be in separate processes or on separate machines.

If all the consumer instances have the same consumer group, then the records will effectively be load balanced over the consumer instances.

If all the consumer instances have different consumer groups, then each record will be broadcast to all the consumer processes.
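The consumer-group semantics above — load-balancing within a group, broadcast across groups — can be modeled simply. In this toy sketch, one instance per group receives each record, chosen by partition number modulo group size; real Kafka uses a rebalance protocol to assign partitions, so this is only the idea:

```python
def deliver(record_partition, groups):
    """Deliver one record: within each consumer group exactly one instance
    receives it (partition modulo group size, mimicking partition
    assignment); different groups each get their own copy (broadcast)."""
    receivers = {}
    for group, instances in groups.items():
        receivers[group] = instances[record_partition % len(instances)]
    return receivers

groups = {
    "billing":   ["billing-0", "billing-1"],   # load-balanced pair
    "analytics": ["analytics-0"],              # single instance gets everything
}
r = deliver(record_partition=3, groups=groups)
```

Each group name acts like an independent subscription: adding instances to a group splits its load, while adding a new group duplicates delivery.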

  • TIP: Cloudera Stream Processing Microservices

Cloudera Data Platform has supported microservices since its early days, without requiring the data platform's presence, as these microservices are highly decoupled and independently scalable. Please refer to the following blog for full details on microservices:

Open-Source Dedication and Innovation

Cloudera is dedicated to the Kafka ecosystem and continues to be actively involved with the Kafka open-source community through deep engineering relationships with other Kafka committers. This relationship has led to critical innovations and product improvements, many of which have been described here. SMM (Streams Messaging Manager) was developed because Kafka does not inherently have a user interface, and so IT teams across the enterprise struggled to understand what went on within their Kafka clusters. We created this unified toolset in response to what our Kafka customers most needed.

Streams Replication Manager, which is directly incorporated into Streams Messaging Manager, is another example of best-in-class engineering and innovation. We improved upon the original Kafka open-source messaging replication tool by infusing the concepts of clusters, global configuration, and global management APIs. The result is a comprehensive Kafka replication platform that not only guarantees high availability and durability across large Kafka architectures but also enables a number of other business critical use cases such as geo proximity and cloud migrations.

Figure: Cloudera Shared Data Experience

Multi Cloud and Hybrid Cloud Support

CDP is the world’s first enterprise data cloud, and thus we are able to help our customers support streaming architectures that must retain an on-premises footprint for sensitive applications while leveraging the cost efficiencies of public cloud providers for the rest. All of the Kafka ecosystem components can be instantly provisioned into your on-premises, private, or favorite public cloud while leveraging the unified data security, governance, lineage, and control provided through CDP’s Shared Data Experience.

Security, Data Governance, and Data Lineage are First Class Citizens

The most important part of a large Kafka ecosystem is how it is pulled together, and CDP’s Shared Data Experience (SDX) is by far the biggest differentiator when compared to other Kafka platform providers. This is because data security, control policies, governance, and lineage are set once and automatically enforced on every data platform and across all components of your streaming architecture.

High Level Platform Abstraction

A final important aspect is that the integration of CDF and CDP provides a unified platform that handles the complexities of connecting, managing, and integrating the tenets of flow management, streams messaging, and stream processing and analytics through a high level of abstraction. This empowers business teams to fulfill their share of the responsibility of delivering, in near real time, the innovative products and services that their customers, employees, and regulators expect. Cloudera is superior because we provide a whole Kafka ecosystem that is greater than the sum of its parts.

Streams Messaging Manager (SMM)

Probably the most striking component of the Cloudera Kafka ecosystem is Cloudera Streams Messaging Manager (SMM) because it provides so much power across so many teams. SMM is a single monitoring/management dashboard that provides end-to-end visibility into how data moves across Kafka clusters between producers, brokers, topics, and consumers. It is a complete Kafka toolset that addresses the unique needs of DevOps, application development, platform operations, governance, and security teams.


Platform Operations and DevOps teams need the ability to create alerts to manage the service level agreement (SLA) of their applications. SMM provides rich alert management features for the critical components of a Kafka cluster including brokers, topics, consumers, and producers by making use of two key constructs.

  • Alert Notifier: An alert notifier tells SMM what to do when a configured alert is triggered. Out-of-the-box notifiers include sending alerts to a configured email inbox, an HTTP endpoint, or a Kafka topic, to integrate alerts with other systems used across the enterprise (e.g., ticketing/case-creation systems). The user can also configure custom alert notifiers.
  • Alert Policy: An alert can be defined for any Kafka entity: cluster, broker, topic, producer, or consumer. A set of metrics can be selected to define a series of simple alerts, while conditional operators can be used to compose complex alerts that monitor a variety of metrics across a number of entities. Each alert policy is also configured with a notifier (above) that is invoked when the alert fires.
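The policy/notifier split described above can be sketched as follows. The metrics, thresholds, and notifier are illustrative assumptions, not SMM's API:

```python
def evaluate_policy(metrics, conditions, notifiers):
    """Fire an alert when every condition on the entity's metrics holds,
    then hand it to each configured notifier. A sketch of the
    policy/notifier split, not SMM's API."""
    if all(cond(metrics) for cond in conditions):
        alert = {"metrics": metrics, "status": "FIRED"}
        for notify in notifiers:
            notify(alert)
        return alert
    return None

sent = []
policy_conditions = [
    lambda m: m["consumer_lag"] > 10_000,    # hypothetical SLA threshold
    lambda m: m["bytes_in_per_sec"] > 0,     # topic is actually active
]
alert = evaluate_policy(
    {"consumer_lag": 50_000, "bytes_in_per_sec": 1_024},
    policy_conditions,
    notifiers=[sent.append],                 # stand-in for email/HTTP/topic
)
```

Keeping conditions (the policy) separate from delivery (the notifiers) is what lets one policy fan out to email, HTTP endpoints, and Kafka topics at once.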

As an example, the image below shows interactive visualizations that enable you to fully understand how data flows across Kafka clusters.

End-to-End Kafka Visualization

Figure: Cloudera Streams Messaging Manager Interface

Cloudera Streams Messaging

Cloudera delivers the most comprehensive streams messaging and management capabilities in the industry. It includes:

  • Latest certified, secure, and governed Apache Kafka that provides the messaging backbone.
  • Schema Registry for centralized schema management
  • Kafka Streams for real-time analytics
  • Kafka Connect for native connectivity with key data sources.
  • Cruise Control for cluster management and monitoring
  • Apache Ranger for rich access control and security
  • Streams Messaging Manager for monitoring and management of enterprise Kafka
  • Streams Replication Manager for disaster recovery and replication of enterprise Kafka clusters


Key Benefits

  • Process millions of messages per second with Apache Kafka.
  • Adopt a hybrid cloud architecture for your streaming needs across any public cloud.
  • Re-use schemas, define relationships between schemas, and manage schema versions with Schema Registry.
  • Leverage the integration of Schema Registry across Kafka and Apache NiFi by using the same schemas end-to-end.
  • Optimize and auto-scale your clusters with Cruise Control.
  • Cure “Kafka blindness” by getting visibility into all your Kafka clusters with Streams Messaging Manager.
  • Manage enterprise Kafka data effectively for active-active cluster replication and disaster recovery use cases.

Extend Monitoring/Management Capabilities with REST

The user interface is powered by first class REST services and all SMM capabilities are exposed as REST endpoints, making the product completely extensible. This is a developer and DevOps friendly way to integrate with other enterprise tools such as application performance monitoring and case/ticketing systems.

Track Data Lineage and Governance from Edge-To-Enterprise

Like other integrated components of the Cloudera DataFlow platform, SMM enjoys SDX’s unified data security and governance from edge environments across to your enterprise’s data center and cloud platforms. This includes Ranger for security and Apache Atlas for end-to-end data governance. With that, you have access to the metadata and metrics about every Kafka topic and can produce complete data lineage and audit trails, even across multiple Kafka hops.

The example below shows how a user can drill down from an edge sensor consumer (1) and launch a data lineage diagram (2) to directly see related flows across Kafka topics (3).

Figure: Kafka topics to Atlas Lineage

Integration with Schema Registry

Schema Registry, another key component of CDP, has been integrated with SMM, providing the ability to view, create and modify the schema associated with any given Kafka topic. It allows the user to define schemas for a given Kafka topic and provides the following key benefits:

  • Data Governance:

Provide reusable schema (centralized registry), define relationships between schemas (version management), and enable generic format conversion and generic routing (schema validation).

  • Operational Efficiency:

Avoid attaching schemas to every piece of data (centralized registry), enable consumers and producers to evolve at different rates (version management), and ensure data quality (schema validation).

  • Topic Lifecycle Management:

SMM enables users to create, update and delete topics directly through the user interface as well as via REST services. Topics can be created as a function of availability characteristics (replication factor, minimum in-sync replicas, etc.) or with custom settings. These operations are fully integrated with Kafka Ranger policies such that only authorized users can perform these topic lifecycle management actions.
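The centralized-registry, versioning, and validation concepts above can be imitated in a few lines. The following is an illustrative in-memory sketch only (class and field names invented), not the Cloudera Schema Registry API:

```python
# Minimal in-memory sketch of a schema registry: one central store,
# per-topic schema versions, and record validation against the latest
# schema. Illustrative only -- not the actual Schema Registry interface.

class MiniSchemaRegistry:
    def __init__(self):
        self._schemas = {}          # topic -> list of schema versions

    def register(self, topic, schema):
        """Add a new schema version for a topic; returns the version number."""
        versions = self._schemas.setdefault(topic, [])
        versions.append(schema)
        return len(versions)        # versions are 1-based

    def latest(self, topic):
        return self._schemas[topic][-1]

    def validate(self, topic, record):
        """Check a record carries every field the latest schema requires."""
        schema = self.latest(topic)
        return all(field in record for field in schema["fields"])

registry = MiniSchemaRegistry()
v1 = registry.register("truck-events", {"fields": ["truck_id", "speed"]})
v2 = registry.register("truck-events", {"fields": ["truck_id", "speed", "geo"]})

ok = registry.validate("truck-events", {"truck_id": 7, "speed": 88, "geo": "31.9,35.9"})
bad = registry.validate("truck-events", {"truck_id": 7})
print(v1, v2, ok, bad)  # 1 2 True False
```

The point of the sketch is the division of labour: producers and consumers share one registry, evolve against numbered versions, and never attach the schema to each message.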


Cloudera Streaming Analytics (CSA) offers real-time stream processing and streaming analytics powered by Apache Flink. Flink implemented on CDP provides a flexible streaming solution with low latency that can scale to large throughput and state. In addition to Flink, CSA includes SQL Stream Builder, which offers a data-analytics experience using SQL queries on your data streams.

Key features of Cloudera Streaming Analytics:

SQL Stream Builder

SQL Stream Builder is a job management interface used to compose and execute Streaming SQL on streams, as well as to create durable data APIs for the results.
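A Streaming SQL statement of the kind SQL Stream Builder executes might look like the following; the table and column names are invented for illustration:

```python
# An illustrative Flink-style Streaming SQL query over a Kafka-backed
# stream: a one-minute tumbling-window aggregation with an alert filter.
# Table and column names are invented.
streaming_sql = """
SELECT sensor_id,
       AVG(temperature) AS avg_temp
FROM sensor_readings
GROUP BY sensor_id,
         TUMBLE(eventTimestamp, INTERVAL '1' MINUTE)
HAVING AVG(temperature) > 90
"""

print("TUMBLE" in streaming_sql)  # True
```

Queries like this run continuously: each closed window emits a result row, which SQL Stream Builder can then expose as a durable data API.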

Cloudera Platform

Implementing Flink on the Cloudera Platform allows you to easily integrate with Runtime components and have all the advantages of cluster and service management with Cloudera Manager.

Streaming Platform

For streaming analytics, CSA fits into a complete streaming platform augmented by Apache Kafka, Schema Registry, and Streams Messaging Manager in the Cloudera Runtime stack.

Supported Connectors

CSA offers Kafka, HBase, HDFS, Kudu and Hive as connectors to choose from based on the requirements of your application deployment.

Monitoring Solutions

Within CSA, the Kafka Metrics Reporter, Streams Messaging Manager and the reworked Flink Dashboard help you monitor and troubleshoot your Flink applications.

Additional Frameworks

The log aggregation framework and job tester framework in CSA also enable you to create more reliable Flink applications for production.


Cloudera Machine Learning Overview

Machine learning has become one of the most critical capabilities for modern businesses to grow and stay competitive today. From automating internal processes to optimizing the design, creation, and marketing processes behind virtually every product consumed, ML models have permeated almost every aspect of our work and personal lives.

Cloudera Machine Learning (CML) is Cloudera’s cloud-native machine learning service, built for CDP. The CML service provisions clusters, also known as ML workspaces, that run natively on Kubernetes. Each ML workspace enables teams of data scientists to develop, test, train, and ultimately deploy machine learning models for building predictive applications, all on the data under management within the enterprise data cloud. ML workspaces are ephemeral, allowing you to create and delete them on demand, and they support fully containerized execution of Python, R, Scala, and Spark workloads through flexible and extensible engines.

Core Capabilities:

  • Seamless portability across private cloud, public cloud, and hybrid cloud, powered by Kubernetes
  • Rapid cloud provisioning and autoscaling
  • Fully containerized workloads – including Python, R, and Spark-on-Kubernetes – for scale-out data engineering and machine learning with seamless distributed dependency management
  • High-performance deep learning with distributed GPU scheduling and training
  • Secure data access across HDFS, cloud object stores, and external databases

The cloud offers many advantages for unpredictable and heterogeneous workloads, but there are two challenges:

  1. data is often spread across multiple clouds and on-premises systems, and
  2. existing products only cover parts of the machine learning lifecycle.

Cloudera Machine Learning directly addresses both these issues. It’s built for the agility and power of cloud computing, but isn’t limited to any one provider or data source. And it is a comprehensive platform to collaboratively build and deploy machine learning capabilities at scale. CML gives you the power to transform your business with machine learning and AI. CML users are:

  • Data management and data science executives at large enterprises who want to empower teams to develop and deploy machine learning at scale.
  • Data scientist developers (using open-source languages like Python, R, and Scala) who want fast access to compute and corporate data, the ability to work collaboratively and share, and an agile path to production model deployment.
  • IT architects and administrators who need a scalable platform to enable data scientists in the face of shifting cloud strategies while maintaining security, governance and compliance. They can easily provision environments and enable resource scaling so they – and the teams they support – can spend less time on infrastructure and more time on innovation.




End to end data science for better decision making.

KNIME Analytics Platform:
KNIME Analytics Platform is the open source software for creating data science applications and services. Intuitive, open, and continuously integrating new developments, KNIME makes understanding data and designing data science workflows and reusable components accessible to everyone.

KNIME Server:
KNIME Server is the enterprise software for team-based collaboration, automation, management, and deployment of data science workflows, data, and guided analytics. Non-experts are given access to data science via the KNIME WebPortal, or can use REST APIs to integrate workflows as analytic services into applications and IoT systems.

KNIME Extensions
Open source extensions for KNIME Analytics Platform are developed and maintained by KNIME and provide additional functionalities such as access to and processing of complex data types, as well as the addition of advanced machine learning and AI algorithms.

KNIME Integrations
Open-source integrations for KNIME Analytics Platform (also developed and maintained by KNIME), provide seamless access to large open-source projects such as Keras for deep learning, H2O for high performance machine learning, Apache Spark for big data processing, Python and R for scripting, and more.

Shape your data

Derive statistics, including mean, quantiles, and standard deviation, or apply statistical tests to validate a hypothesis. Integrate dimensionality reduction, correlation analysis, and more into your workflows.

Aggregate, sort, filter, and join data either on your local machine, in-database, or in distributed big data environments.

Clean data through normalisation, data type conversion, and missing value handling. Detect out of range values with outlier and anomaly detection algorithms.
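Outside KNIME, the same shaping steps can be sketched in a few lines of standard-library Python to make the operations concrete (KNIME performs these through nodes; the sample readings are invented):

```python
import statistics

readings = [12.1, 11.8, None, 12.4, 97.0, 11.9]   # one missing value, one outlier

# Missing-value handling: impute with the median of the present values.
present = [r for r in readings if r is not None]
median = statistics.median(present)
cleaned = [r if r is not None else median for r in readings]

# Out-of-range detection via median absolute deviation, which stays
# robust even when the outlier itself is in the sample.
mad = statistics.median(abs(r - median) for r in cleaned)
outliers = [r for r in cleaned if abs(r - median) > 5 * mad]

# Normalisation of the cleaned values to the 0..1 range.
lo, hi = min(cleaned), max(cleaned)
normalised = [round((r - lo) / (hi - lo), 3) for r in cleaned]

print(median, outliers)  # 12.1 [97.0]
```

Each step maps to a KNIME node family: Missing Value, Numeric Outliers, and Normalizer, respectively.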

Blend data from any source

Open and combine simple text formats (CSV, PDF, XLS, JSON, XML, etc.), unstructured data types (images, documents, networks, molecules, etc.), or time series data.

Connect to a host of databases and data warehouses to integrate data from Oracle, Microsoft SQL, Apache Hive, and more. Load Avro, Parquet, or ORC files from HDFS, S3, or Azure.

Access and retrieve data from sources such as Twitter, AWS S3, Google Sheets, and Azure.

Extract and select features (or construct new ones) to prepare your dataset for machine learning using genetic algorithms, random search, or backward and forward feature elimination. Manipulate text, apply formulas to numerical data, and apply rules to filter out or mark samples.

We provide elite data science services and help our clients in the UAE, Jordan and Saudi Arabia unlock data value by utilising the power of AI (Artificial Intelligence) and Machine Learning (ML) models. In short, data is the fuel of future development, and Palmira provides you with the best tools and expertise to accomplish your vision.




Liferay is the best low-code Digital Experience platform
to help you innovate faster and deliver real business value.

Visually develop your applications, easily integrate with any system, add your own custom code when you need it, and change applications at the speed of business.

Deliver personalized experiences for the entire customer journey.

Digital Experience

A Flexible Platform to Fit Your Needs
Liferay DXP is designed to work within your existing business processes and technologies to build a custom solution that uniquely meets your needs. Great experiences don’t just happen; create them with Liferay DXP.

Connect Your Experiences and Systems For a Single View
Bring your business together onto a unified digital platform that makes it easy to host multiple systems in order to create cohesive customer experiences.

Create Personalised Experiences At Every Point in the Journey
Make each customer experience personal and stay relevant throughout their journeys. Liferay DXP makes it easy to create and deliver experiences that help you meet your customers’ needs.

Customise Your Solutions Quickly On a Flexible Platform
Innovate and be ready for future needs on a customizable platform that grows alongside your needs. Liferay DXP is unrivalled when it comes to integration and interoperability.

How 1,300+ Companies Use Liferay’s Software
See how successful companies use Liferay DXP to accomplish what they need for their enterprises to win, serve and grow their customers. Drive Real Business Impact… Fast.

  • Intranets Equip employees with modern tools to get work done faster.
  • Websites Create personalised web experiences that attract and empower customers.
  • Customer Portals Build strong customer relationships that extend beyond the purchase.
  • Partner Portals Satisfy your partners with a platform built to handle complex scenarios.
  • Integration Platforms Unify your new and existing systems into one powerful platform.

Contact us to Discover IT’s Secret Weapon: DXPs

Is there a piece of technology that will enable IT teams to successfully build the solutions their organisation needs? It’s not too good to be true; it’s a digital experience platform (DXP). DXPs are becoming an urgent necessity for global IT teams, helping organisations:
  ✔ Cut down on costs and resources
  ✔ Fulfil multiple needs through one platform
  ✔ Adjust for future needs quickly

See how your organisation can leverage a DXP as the centrepiece in a tech ecosystem that brings content, data, experiences, and applications into one layer for optimised digital experiences across the entire user journey. Automate your services today…

We help you Develop, Execute and Monitor enterprise-scale process automation and workflow management solutions. Define rules for your processes to automate decisions and change those rules at any time.

A low-code development platform allows users to create application software through graphical user interfaces and configuration. Low-code development reduces the amount of traditional hand-coding, enabling accelerated delivery of business applications. Technologies like low-code development have the potential to deliver huge cost savings and efficiencies to systems, web and mobile apps, business processes and operations for both the government and private sectors. Blend in future IoT and analytics, and you are redefining the art of the possible, capturing real-time data streams from all endpoints to surface insights that can optimize current government vertical services.


Organisations around the globe are looking not only for a platform to revamp their services and unify digital identity, but also for software flexibility, high-standard implementation, and innovative service providers to ensure quality and help leap their business to the next level.
What makes us stand out is our deep know-how in designing services, modelling customer journeys, defining touchpoints, and monitoring the outcomes of those services, as well as long experience in unlocking data value through integration and data science use cases in the EMEA region.

Drive Real Business Impact… Fast.



Dream big on the IoT with us!

Why go IoT alone?

Build on experience

Put our 10 years of successful IoT experience to work, including best-in-class approaches proven in 1,000s of solutions worldwide.

Reuse best practices

Access a proven project methodology, 500+ solution accelerators and 3,000+ reusable processes to accelerate your time to value.

Get help when you need it

Our enablement services and partnerships can help you develop and deploy your solution faster. Ask about Cumulocity IoT QuickStart to get a pilot project running in 6 weeks or less—enablement services included.

“I have seen many different platforms in various shapes and sizes and I can say that Cumulocity is probably one of the best platforms in terms of stability and purpose.”

Global IoT & Data Analytics Director, Gardner Denver

Let’s achieve IoT together!

Why chance IoT? The IoT can have a big impact on your future—so dream big with a leader. Software AG has a decade of experience in working with businesses of all sizes in implementing the IoT, and we’re consistently rated a leader by the analysts. You can:

Quickly and painlessly get started with the IoT

Connect OT to IT and seamlessly combine IoT insights with the processes that run your business

Grow with the freedom to choose between SaaS, PaaS or a custom, multi-vendor IoT solution.


webMethods ESB

Enterprise Service Bus (ESB)



webMethods has a long history of providing innovative, industry-leading integration solutions to organizations, both public and private, across the globe.

API Management

webMethods API Gateway

webMethods API Gateway enables you to securely expose your APIs to third-party developers, partners, and other consumers for use in web, mobile and Internet of Things (IoT) applications.  With webMethods API Gateway you can easily create APIs, define Service Level Agreement (SLA) policies, and seamlessly publish your APIs to webMethods Developer Portal.

Key benefits

  • Secure your APIs from malicious external attacks
  • Eliminate threats from specific IP addresses and mobile devices
  • Reduce or eliminate the need for unnecessary holes in your firewall
  • Ensure API access is limited to authorized and authenticated consumers
  • Change protocols, message formats or service locations without impacting consumer-provider relationships
  • Make the same underlying services available to new applications or APIs over a different protocol or security standard—without costly recoding
  • Collect API usage data for monetization and external billing solutions
  • Provide the same quality of service to external and internal developers and consumers
  • Improve customer experience across channels and touchpoints.


Secure APIs – webMethods API Gateway provides DMZ-level protection from malicious attacks initiated by external client applications. With API Gateway you can secure traffic between API consumer requests and the execution of services on API Gateway, guarding against Denial of Service (DoS) attacks based on IP address, specific mobile devices, and message volume. API Gateway also provides virus scanner integration and helps you avoid additional inbound firewall holes by using reverse invoke, or inside-out, service invocations.
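The IP- and volume-based checks described above reduce to a simple admission decision. The following toy function illustrates the idea only; the denylist and size cap are invented values, not API Gateway's actual policy format:

```python
# Toy version of DoS-style admission checks: reject requests from
# denylisted IPs and requests whose message volume exceeds a cap.
# Values are invented for illustration.
DENYLIST = {"203.0.113.9"}
MAX_BYTES = 1_000_000

def admit(source_ip: str, payload_bytes: int) -> bool:
    """Return True only for requests that pass both checks."""
    return source_ip not in DENYLIST and payload_bytes <= MAX_BYTES

print(admit("198.51.100.7", 2_048), admit("203.0.113.9", 10))  # True False
```

In the real gateway these checks run at the DMZ boundary, before any request reaches the protected services.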

Mediation – webMethods API Gateway provides complete runtime governance of APIs published to external destinations. API Gateway enforces access tokens and operational policies, such as security policies, for runtime requests between consumers and native services. API providers can enforce security, traffic management, monitoring and SLA management policies, transform requests and responses into expected formats, perform routing and load balancing of requests, and collect event metrics on API consumption and policy evaluation.

Monetization features – webMethods API Gateway provides API monetization features, including defining and managing API plans and packages, for easily supporting API subscriptions and charge-back services.

Dedicated, web-based user interface – webMethods API Gateway provides a single, web-based UI to perform all the administration and API-related tasks from the API creation, policy definition and activation to the creation of consumer applications and API consumption, as well as administrative activities.

Built-in dashboarding and usage analytics – webMethods API Gateway provides information about API Gateway events and API-specific events, as well as details about which APIs are more popular than others. This information is available in interactive dashboards so that API providers can understand how their APIs are being used, which in turn can help identify ways to improve their users’ experience and increase API adoption.

Support for SOAP and REST APIs – webMethods API Gateway supports both SOAP-based APIs as well as REST-based APIs. This support enables organizations to leverage their current investments in SOAP based APIs while they adopt REST for new APIs.

Developer Portal integration – webMethods API Gateway is integrated with webMethods Developer Portal to provide a complete API management solution. APIs created in API Gateway can be synchronized with webMethods Developer Portal for API discovery and access control, as well as for providing API user documentation and testing.

Message transformation, pre-processing and post-processing – webMethods API Gateway lets you configure an API and transform the request and response messages to suit your requirements. To do this, you can specify an XSLT file to transform messages during the mediation process. You can also configure an API to invoke webMethods Integration Server services to pre-process or post process the request or response messages.
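API Gateway applies XSLT during mediation; purely to make the transformation idea concrete, here is an analogous request reshaping sketched with Python's standard-library ElementTree (the element names are invented for illustration):

```python
import xml.etree.ElementTree as ET

# An invented consumer-facing request payload.
request = "<order><id>42</id><qty>3</qty></order>"

def transform(xml_text: str) -> str:
    """Rename the root and remap fields the way a backend might expect,
    mirroring what a mediation-time XSLT would do."""
    src = ET.fromstring(xml_text)
    dst = ET.Element("backendOrder")
    ET.SubElement(dst, "orderId").text = src.findtext("id")
    ET.SubElement(dst, "quantity").text = src.findtext("qty")
    return ET.tostring(dst, encoding="unicode")

print(transform(request))
# → <backendOrder><orderId>42</orderId><quantity>3</quantity></backendOrder>
```

In the gateway the same mapping would live in an XSLT stylesheet, so consumers and providers can each keep their own message shape.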

Developers’ engagement – APIs can be published to Developer Portal from API Gateway for developers to discover them. Organizations can group APIs and define policy enforcements on them as a single unit, which can then be subscribed by the developers.

Clustering support – Multiple instances of API Gateway can be clustered together to provide scalability. API Gateways can easily allow a load balancer to be placed in front of the clustered API Gateway instance to properly distribute request messages.

DevOps (CI/CD) – The solution fully supports automated CI/CD, with support for automated deployment using both a scriptable deployment tool and APIs.

API-Enabled – All capabilities of API Gateway are available through APIs, which can be used for purposes such as deployment automation, activating and deactivating APIs, and extracting monitoring data and audit logs.

Flexible and Distributed Deployment – The solution can be deployed on-premises, in the DMZ, on private cloud infrastructures (e.g., AWS, Azure, Google) and as a PaaS (webMethods API Cloud). The Gateway is also available as a Docker container, which provides an easy way of deploying the Gateway onto new environments.

Secure Deployment – When API Gateway runs both on-premises and in the DMZ, the solution supports a unique concept called “Reverse Invoke”, which makes firewall administration and security easier to manage since you don’t need to allow any incoming traffic from the DMZ to the intranet. The communication channel between the two gateways is established inside-out (from the intranet to the DMZ).

webMethods Developer Portal

The API economy

Application Programming Interfaces (APIs) enable the efficient sharing of information and data across real-time, distributed cloud and mobile applications. Through that sharing, APIs can connect products or services to massive new communities, driving growth across a wide range of industries. This “API economy” broadens a company’s reach beyond direct sales, OEMs, and distributors to include virtually any developer interested in incorporating a company’s features and services into new social and mobile applications—driving up revenue opportunities.

For the developer community to find, read about, discuss, and test your APIs, you need webMethods Developer Portal. Developer Portal provides a consumer-centric UI for the discovery of APIs.

The portal exposes API documentation to third-party developers, manages the developer on-boarding process, and allows these developers to use the exposed APIs for creative new uses. When developers leverage your APIs with new mashups and apps or to support new devices, your reach is extended, and new channels are opened to your corporate assets. If you want to get on board the new API economy to reach new customers and unlock the business value of your corporate assets, you must make your APIs easily accessible to developers.

Key benefits

  • Single solution for both external and internal developers
  • Analytics at the portal to better understand your users
  • Intuitive user interface
  • Highly customizable look and feel
  • API security is ensured using API keys and OAuth2 credentials support
  • Seamless integration with other webMethods components


Branding – Customize and brand the portal in accordance with your company’s corporate identity (i.e., logos, skins and corporate colors). Add additional pages. Make it entirely your own!

Easy discovery and testing of APIs – webMethods Developer Portal provides full text search capabilities that help developers quickly find APIs of interest. API descriptions and additional documentation, usage examples, and information about policies enforced at the API level provide more details to help developers decide whether to use a particular API. Developers can use the code samples and expected error and return codes to try out APIs they are interested in, directly from within the Developer Portal.

Documentation – Rich descriptions of the APIs, examples of how to use the APIs, file attachments for additional documentation and information about policies enforced on the API level are all available on the portal.

Community Support – A collaborative community environment allows users to rate APIs and contribute to open discussions with other developers. Create groups for collaboration on single or multiple APIs. Administrators can announce administrative events and moderate discussions, as can coordinators who are defined during group creation.

API support – Designed for REST APIs, webMethods Developer Portal also fully supports traditional SOAP-based APIs. This allows you to leverage your current investments in SOAP-based APIs while adopting REST for new APIs.

Integrated API testing – Developers can easily try out APIs directly within webMethods Developer Portal to see first-hand how the API behaves. Code samples and expected error/return codes with descriptions are provided. Try test invocations with different input parameters and quickly see the results.

Built-in usage analytics – Understand where your visitors are coming from, what pages gather most interest, which APIs are popular, and which are not. API providers can use this information to gain valuable API insights, improve the portal’s customer experience and increase API adoption by developers. This is especially useful to line-of-business and marketing executives to determine what is working and what is not, so that immediate corrective actions can be implemented. In addition, the Google Analytics™ plug-in can be used for additional insight on visitor traffic and how your marketing programs are performing.

Search – Developers can quickly find the APIs they need with full text search capabilities.

Design – Count on responsive design for both desktop and mobile access. Use the Web-based administration interface to manage users, groups, and permissions.

API grouping – Grouping APIs using definable criteria helps developers discover APIs in larger API catalogs, such as free vs. paid, by business domain, or public vs. B2B partner. You can also group APIs based on a configurable maturity level, for example, beta APIs vs. final APIs.

Multiple deployment options – Deploy behind the firewall, in the DMZ, in your private cloud or in the public cloud. The choice is yours.

Built-in workflows – Use an approval process workflow to manage API access requests. Access tokens are automatically provisioned to the gateway infrastructure. Use an on-boarding workflow to allow users to sign up as a developer on the Developer Portal.

Track specific APIs – Sign up to track the APIs you are interested in and automatically receive notices of changes to them.

webMethods Micro-gateway

Along with distributed microservices architectures comes the challenge of managing microservices security.  Designed to sit closer to business logic and protect it, webMethods Microgateway is your solution.  This independent gateway—lightweight, agile and fast—works with webMethods API Gateway or as a stand-alone solution to secure your microservices in a distributed environment.  Highly efficient, the gateway uses a very small footprint and is fast to deploy.

Microservices and Micro-gateways go hand in hand

IT needs better and faster ways to scale infrastructures to meet dynamic business demands. That’s why microservices architectures are trending. Small, independently deployable services built around business capabilities are ideal for rapid development and continuous delivery. With a distributed architecture, you need to be able to scale up and down quickly while serving many more systems and gateways you don’t want to overload.

webMethods Microgateway answers this requirement perfectly. With a “micro” footprint, you can:

  • Manage API access to your microservices across a distributed architecture
  • Prevent main gateways from overloading
  • Reduce the impact from routing and traffic through a single gateway while supporting east-west traffic

Key Benefits

  • Secure and mediate API access to microservices
  • Apply routing policies and throttling to manage consumer-provider connectivity
  • Optionally federate Micro-gateways with API Gateway for centralized management and monitoring
  • Deploy in multiple form factors to support different scalability and management goals
  • Easily provision and scale across microservices architecture
  • Very low runtime footprint
  • Fast spin up

Manage API access to microservices with webMethods Microgateway

Multiple form factors

Provision webMethods Microgateway as a Java® instance or as a Docker® container with a micro-Linux® host. As a self-contained Java app, the Microgateway is a “headless” implementation that’s independent, lightweight and agile. In a “Dockerized” configuration, the Microgateway includes a micro-Linux host and is scalable and lean.

Flexible deployment patterns

Microservices architectures need different levels of granularity and control—so webMethods Microgateway gives you options. In a stand-alone deployment, the Microgateway can run independently from the microservice. When the microservice dies, the Microgateway continues to function. This option is preferable when the Microgateway is hosting multiple APIs and needs to be scaled independently. In a sidecar deployment, the Microgateway runs close to the business logic. It scales together with the microservice but likely only contains policies for a single microservice. This option leaves no network gaps and eliminates potential latency issues.

Seamless failover with service registry support

In a microservice landscape, service registries maintain information about service instances and their endpoints. If a microservice becomes unavailable in the cloud, a service registry will enable you to automatically failover to another running instance. If you choose a service registry, the Microgateway sends a request to the service registry to discover the IP address and port where the service is running. Improve service availability in the cloud by configuring a registry for endpoint management.
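The failover lookup described above amounts to asking the registry for a live instance instead of hard-coding an endpoint. A toy sketch follows; the service names, hosts, and registry shape are invented for illustration:

```python
# A toy service registry: the gateway resolves a service name to the
# first healthy endpoint, so a dead instance is skipped automatically.
# Data and names are invented.

registry = {
    "inventory-service": [
        {"host": "10.0.0.5", "port": 8080, "alive": False},  # instance died
        {"host": "10.0.0.6", "port": 8080, "alive": True},
    ]
}

def resolve(service: str) -> str:
    """Return the first healthy endpoint, mimicking registry failover."""
    for inst in registry.get(service, []):
        if inst["alive"]:
            return f'{inst["host"]}:{inst["port"]}'
    raise LookupError(f"no live instance of {service}")

print(resolve("inventory-service"))  # 10.0.0.6:8080
```

A real registry would also track health dynamically; the point here is only that endpoint resolution moves out of the gateway configuration and into the registry.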

Traffic monitoring and control

Throttle traffic with policies to manage the load on provider services. Apply limits to service invocations during a specific time interval for identified clients. Log all traffic requests and responses for analysis.
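A minimal fixed-window throttle captures the idea of limiting invocations per identified client per time interval. This is an illustrative sketch, not Microgateway's actual policy engine:

```python
# Fixed-window throttle: each client gets at most `limit` calls per
# window of `window_secs` seconds. Illustrative sketch only.

class WindowThrottle:
    def __init__(self, limit: int, window_secs: int):
        self.limit = limit
        self.window = window_secs
        self.counts = {}            # (client, window_index) -> call count

    def allow(self, client: str, now: float) -> bool:
        """Count this call and report whether it is within the limit."""
        key = (client, int(now // self.window))
        self.counts[key] = self.counts.get(key, 0) + 1
        return self.counts[key] <= self.limit

throttle = WindowThrottle(limit=3, window_secs=60)
decisions = [throttle.allow("app-1", t) for t in (0, 1, 2, 3, 61)]
print(decisions)  # [True, True, True, False, True]
```

The fourth call inside the first 60-second window is rejected; the call at t=61 falls into a new window, so the count resets.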


The webMethods API Management Platform provides a powerful solution for API monetization, helping you manage the entire API life cycle more easily and expose APIs to external developers and other consumers.

With webMethods, you can:

  • Manage the process of designing, developing, deploying, versioning, and retiring APIs and services
  • Securely provision APIs, providing authentication, mediation, payload transformation and API monetization
  • Analyze usage of APIs, collecting metrics for performance dashboards, SLA violations and invoicing for API monetization
  • Gain real-time visibility into the status of service transactions as they flow across heterogeneous architecture
  • Get notification of events and alerts so you can take immediate action to address problems
  • Enable process automation and automatically create API documentation and provision policies
  • Integrate easily with back-end systems and applications
The rapidly expanding use of Application Programming Interfaces (APIs) is creating a virtual API economy, where APIs are the new distribution channel for products and services. With ever-increasing user demand for apps, companies are exploiting the new API economy by not only developing APIs internally but exposing APIs to thousands of third-party developers through API portal technology. APIs are enabling companies to reach new customers, target new sources of revenue and connect cloud applications to back-end services.

But even with growing user demand, turning APIs into profits is no simple feat. API monetization requires an API management solution that not only handles the development and implementation of APIs but streamlines authorization, billing and payment for API usage.

API Packages, Plans & Subscription Module

webMethods API Management provides a rich set of features and tools that let you participate more easily and profitably in the API economy. With webMethods, you can:

  • Ensure standards and best practices are met as APIs move through their life cycle
  • Enable developers to easily find, read about, discuss and test your APIs
  • Accelerate adoption by cataloguing your APIs for discovery, re-use and life-cycle management
  • Browse and search for APIs using built-in or custom taxonomies or powerful keyword search capabilities
  • Receive change notifications when any event impacts your APIs
  • Secure and mediate your APIs, monitoring API traffic to collect metrics for monetization
  • Gain real-time visibility into service transactions to easily find the root-cause location of SLA violations

With API Gateway, you can define and manage API plans and packages to easily support API subscriptions. API monetization lets you create groups of APIs and offer them together as a subscription offering. You can even create different plans that support higher or lower numbers of transactions or customer support levels.
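The plan/package idea reduces to mapping a consumer's subscription to a transaction quota. The following sketch uses invented plan names, quotas, and prices purely for illustration:

```python
# Invented subscription plans: each plan carries a monthly transaction
# quota and a price, as a sketch of API monetization plans/packages.

PLANS = {
    "free": {"tx_per_month": 1_000,   "price_usd": 0},
    "gold": {"tx_per_month": 100_000, "price_usd": 499},
}

def within_quota(plan: str, used_tx: int) -> bool:
    """Check whether a consumer's usage is still inside the plan's quota."""
    return used_tx <= PLANS[plan]["tx_per_month"]

print(within_quota("free", 950), within_quota("free", 1_500))  # True False
```

A package would simply bundle several APIs under one such plan, and the gateway's usage metrics feed the quota check and the invoicing.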

Fees Module & Payment Gateways

API Portal ships with default gateways that help you configure your subscription module for the respective payment gateway so that real-time metering can be achieved. It is also possible to send these metrics to an external source via REST API.

Payment gateways (WorldPay or Stripe) can be enabled from the plug-in section.

Once the appropriate configurations are done, consumers can add their credits during registration.
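As a sketch of the metrics export mentioned above: the endpoint URL and payload fields below are assumptions for illustration, since the actual schema depends on your API Portal configuration.

```python
import json
from urllib import request

def build_usage_payload(consumer, api, calls, period):
    """Assemble a usage record as JSON (field names are illustrative)."""
    return json.dumps({
        "consumer": consumer,
        "api": api,
        "transactions": calls,
        "period": period,
    }).encode("utf-8")

def push_metrics(url, payload):
    """Prepare a POST of the usage record to an external billing endpoint."""
    req = request.Request(url, data=payload,
                          headers={"Content-Type": "application/json"},
                          method="POST")
    # In a real deployment you would actually send it:
    #   with request.urlopen(req) as resp: return resp.status
    return req   # returned unsent so the sketch stays side-effect free

payload = build_usage_payload("acme-corp", "pay-v1", 1000, "2024-01")
req = push_metrics("https://billing.example.com/usage", payload)
assert json.loads(payload)["transactions"] == 1000
assert req.get_method() == "POST"
```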

Secure API Gateway

webMethods API Gateway is the security and policy enforcer for APIs and their internal applications and systems. The gateway provides a robust API runtime security that only welcomes authorized consumers by using reverse invoke or inside-out service invocations. This protective technique reduces the need to open holes in your firewall.

More specifically, webMethods API Gateway protects you from security threats with DMZ-level protection. You can securely expose your APIs to third-party developers, partners and other consumers with peace of mind. Secure the traffic between API requests and the runtime execution of your services in the gateway, and get protection from malicious attacks such as Denial of Service (DoS) based on IP address, specific mobile devices and even message volume.
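The volume-based DoS protection described above can be illustrated with a minimal per-IP sliding-window rate limiter. This is a generic sketch of the pattern, not gateway internals.

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    def __init__(self, max_requests, window_seconds, blocklist=()):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)   # per-IP timestamps of recent calls
        self.blocklist = set(blocklist)

    def allow(self, ip, now=None):
        if ip in self.blocklist:                  # IP-based denial
            return False
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        while q and now - q[0] > self.window:     # drop hits outside the window
            q.popleft()
        if len(q) >= self.max_requests:           # message-volume threshold
            return False
        q.append(now)
        return True

rl = RateLimiter(max_requests=3, window_seconds=60, blocklist={"10.0.0.9"})
assert not rl.allow("10.0.0.9", now=0)               # blocklisted outright
assert all(rl.allow("10.0.0.1", now=t) for t in (0, 1, 2))
assert not rl.allow("10.0.0.1", now=3)               # 4th call inside window
assert rl.allow("10.0.0.1", now=120)                 # window has passed
```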

Additionally, webMethods API Gateway also provides virus scanner integration, eliminating the need for additional inbound firewall holes using Software AG’s reverse invoke, or inside-out, service invocation technology. As a baseline measurement, webMethods API Gateway provides complete protection against the Top 10 API Security Risks identified by the Open Web Application Security Project (OWASP).

OWASP’s top 10 API security risks as mitigated by webMethods API Gateway:

  1. Broken Object Level Authorization
  2. Broken User Authentication
  3. Excessive Data Exposure
  4. Lack of Resources & Rate Limiting
  5. Broken Function Level Authorization
  6. Mass Assignment
  7. Security Misconfiguration
  8. Injection
  9. Improper Assets Management
  10. Insufficient Logging & Monitoring

webMethods Integration Server

webMethods Integration Server is the enterprise-class foundation for service-based integration of applications and Web services and is the foundation of the webMethods Integration Platform. The webMethods Integration Platform will allow organisations to break free from the costly constraints of point-to-point integrations and siloed systems. With webMethods you can easily integrate your disconnected IT assets to streamline information exchange.

Our Integration Platform is standards-based and offers the most complete application integration infrastructure available. It “speaks” any technology so all your Web services, JMS messaging, packaged and custom apps and legacy systems – just about anything you might use to run your business – can communicate efficiently.  You can service-enable any technology from any vendor. That means you can extend existing IT assets and innovate quickly and cost effectively to meet new business needs. Plus, because webMethods Integration Server is at the core of the webMethods suite, you can build on your investment by adding additional capabilities and technologies, such as BPM, all of which are designed to work together.

Software AG’s Integration Platform Benefits

  • Connect application silos: Custom, packaged and mainframe applications and databases can all interoperate and exchange information easily.
  • Reduce maintenance costs: Reduce time and cost to integrate new applications by eliminating complex point-to-point connections.
  • Improve time to market for applications: Re-use existing assets and build new applications faster without jeopardizing quality.
  • Address big data: Maintain an authoritative database in-memory that combines data from multiple sources while ensuring its currency.
  • Enable enterprise mobile apps: Provision back-end data and capabilities to power enterprise mobile apps in a secure and well-managed fashion.
  • Improve partner relations: On-board partners faster and improve partner collaboration.
  • Unlock business value of unique data: Expose APIs to third-party developers to build new cloud, web, and mobile apps, reaching new customers and opening new revenue streams.
  • Ensure enterprise data quality: Ensure a single version of reference data.

Terracotta BigMemory

Software AG’s Terracotta In-Memory Data Management Platform is the first-choice platform for distributed in-memory data management with extremely low, predictable latency at any scale.

Terracotta technology has been deeply integrated into the webMethods suite to provide API result caching that can be used to cache any kind of data in a local or distributed cache on commodity hardware without running into Java garbage collection pauses or out-of-memory errors.
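The result-caching pattern can be illustrated with a small bounded cache that combines a time-to-live with LRU eviction. This sketch shows only the caching pattern; BigMemory itself stores entries off the Java heap precisely to avoid the garbage-collection pauses mentioned above.

```python
import time
from collections import OrderedDict

class ResultCache:
    """Bounded TTL cache with least-recently-used eviction (illustrative)."""

    def __init__(self, max_entries, ttl_seconds):
        self.max_entries = max_entries
        self.ttl = ttl_seconds
        self._data = OrderedDict()   # key -> (value, expiry time)

    def get(self, key, now=None):
        now = time.monotonic() if now is None else now
        item = self._data.get(key)
        if item is None or item[1] < now:
            self._data.pop(key, None)        # expired or missing
            return None
        self._data.move_to_end(key)          # mark as recently used
        return item[0]

    def put(self, key, value, now=None):
        now = time.monotonic() if now is None else now
        self._data[key] = (value, now + self.ttl)
        self._data.move_to_end(key)
        if len(self._data) > self.max_entries:
            self._data.popitem(last=False)   # evict least recently used

cache = ResultCache(max_entries=2, ttl_seconds=30)
cache.put("GET /rates", {"USD": 3.67}, now=0)
assert cache.get("GET /rates", now=10) == {"USD": 3.67}
assert cache.get("GET /rates", now=31) is None       # expired by TTL
cache.put("a", 1, now=0); cache.put("b", 2, now=0); cache.put("c", 3, now=0)
assert cache.get("a", now=1) is None                 # evicted by size bound
```

Bounding both age and size is what keeps a result cache from growing without limit, which in JVM terms is what prevents the out-of-memory failures the text refers to.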

Key Benefits

  • Data access: in-memory real-time access to information
  • Broad applicability: real-time access to data from multiple client platforms
  • Predictable latency at extreme scale: ability to scale to even higher data limits—up to 100s of terabytes while providing greater fault tolerance
  • Lower TCO and operational flexibility: simple to use on commodity hardware
  • Continuous uptime: continuous availability of data with zero downtime across different deployment topologies


  • Cost-effective scaling: 10-100x more data on a single server (versus classic P2P In-Memory grids like Oracle® Coherence) delivers cost-effective scaling—1,000x faster than disk
  • Support for extended hybrid storage: leverages SSD and flash technologies in addition to DRAM in order to scale to high TB data levels predictably and economically
  • BigMemory SQL: support for SQL to query in memory data
  • Cross-language client support: access to BigMemory data from multiple client platforms (Java®, .NET/C# and C++)
  • High availability: Full fault-tolerance and Fast Restartable Store technology delivers 99.999 percent uptime
  • Multi-data center support: WAN data replication to keep data in sync across regions while offering support for disaster recovery
  • Management: Terracotta Management Console provides a customizable Web dashboard for advanced monitoring and administration of Terracotta deployments
  • Other features include:
    • Configurable consistency keeps data in sync across your array
    • Ehcache interface (Java’s de facto get/put API) means no need to rip up code
    • Additional platform certifications.

With an integrated infrastructure from Software AG, you can manage the entire lifecycle of your APIs. API Management powered by webMethods allows you to securely expose your APIs to external developers and partners.

Forrester Research named Software AG a Leader in The Forrester Wave™: Hybrid Integration for Enterprises, Q4 2016.

In this report, you’ll learn the results of Forrester’s extensive research into the top hybrid integration solutions, including each product’s overall ranking, specific capabilities, and strengths and weaknesses.

webMethods (BPMS)

The webMethods BPMS

The solution delivers everything you need to improve process speed, visibility, consistency, and agility. Capabilities include:

  • Process execution that lets you run perfectly orchestrated processes in a transparent and efficient process landscape
  • Business rules management that enables you to define and change rules at any time without requiring the help of developers
  • Closed-loop analytics that let you take action based on alerts from milestones, goals, and SLAs
  • A customizable inbox that lets you view, act on and collaborate on assigned tasks
  • A recommendation engine that guides you to complete tasks faster and more effectively
  • Task and team management to improve how people interact with your processes
  • Business activity monitoring to catch process problems in real time and prevent them from impacting customers or revenue
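The rules-management idea, conditions and actions kept as data that analysts can change without redeploying code, can be sketched minimally. The rule shapes below are invented for illustration, not the webMethods rules engine format.

```python
# Comparison operators available to rule authors:
OPS = {">": lambda a, b: a > b,
       "<": lambda a, b: a < b,
       "==": lambda a, b: a == b}

def evaluate(rules, fact):
    """Return the actions of every rule whose condition matches the fact."""
    actions = []
    for rule in rules:
        field, op, threshold = rule["condition"]
        if OPS[op](fact[field], threshold):
            actions.append(rule["action"])
    return actions

# Rules live as data, so changing a threshold needs no code change:
sla_rules = [
    {"condition": ("cycle_time_hours", ">", 48), "action": "escalate"},
    {"condition": ("amount", ">", 10000), "action": "require_approval"},
]

assert evaluate(sla_rules, {"cycle_time_hours": 72, "amount": 500}) == ["escalate"]
assert evaluate(sla_rules, {"cycle_time_hours": 5, "amount": 50000}) == ["require_approval"]
```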




ALFABET EA Design and Management

Increase return on IT investment and decrease IT complexity

Accelerate your IT transformation with Alfabet

Digital transformation is pushing IT to do more. You’re tasked with mastering new tech, collaborating with the business on digital strategies, and recommending strategic investments for the future. How can you do more with confidence? Go with the leader in enterprise architecture management, IT portfolio management and IT planning tools—Alfabet.

We offer a unique EA practice by assessing the fitness of your application portfolio against business and technical requirements, utilising best practices such as Gartner’s TIME model.

Industry-analyst-recognized Alfabet helps you invest in IT wisely by managing your current IT portfolio and collaboratively planning for the future.

ALFABET and Enterprise Architecture

Palmira helps you build your Enterprise Architecture including the Business Layer (Strategy Management, Business Processes, Performance Management, Service Management), Application Layer, Infrastructure Layer (manage your IT portfolio), and Data Layer. On top of your EA, we provide a comprehensive governance model to manage your internal audit and risk management.

“Enterprise architecture (EA) is a discipline for proactively and holistically leading enterprise responses to disruptive forces by identifying and analysing the execution of change toward desired business vision and outcomes. EA delivers value by presenting business and IT leaders with signature-ready recommendations for adjusting policies and projects to achieve target business outcomes that capitalise on relevant business disruptions. EA is used to steer decision making toward the evolution of the future state architecture”

Gartner, “Enterprise Architecture (EA),” IT Glossary

Palmira enables organisations to manage their enterprise architecture containing four layers, called the business perspective, the application perspective, the information perspective, and the technology perspective:

  • The business perspective defines the processes and standards by which the business operates on a day-to-day basis.
  • The application perspective defines the interactions among the processes and standards used by the organisation.
  • The information perspective defines and classifies the raw data (such as document files, databases, images, presentations, and spreadsheets) that the organisation requires in order to efficiently operate.
  • The technology perspective defines the hardware, operating systems, programming, and networking solutions used by the organisation.
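Dependency and impact analysis across these perspectives comes down to a graph walk: model elements (processes, applications, data, technology) as nodes and "depends on" relations as edges. A minimal sketch, with invented element names:

```python
# EA repository as a dependency graph (names are illustrative):
depends_on = {
    "Order-to-Cash process": ["CRM app", "Billing app"],   # business layer
    "CRM app": ["Customer DB"],                            # application layer
    "Billing app": ["Customer DB", "Payments service"],
    "Customer DB": ["DB server cluster"],                  # data / technology
}

def impacted_by(element, graph):
    """Everything that directly or transitively depends on `element`."""
    reverse = {}
    for src, targets in graph.items():
        for t in targets:
            reverse.setdefault(t, []).append(src)
    seen, stack = set(), [element]
    while stack:
        for parent in reverse.get(stack.pop(), []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

# Retiring the Customer DB would impact both apps and the business process:
assert impacted_by("Customer DB", depends_on) == {
    "CRM app", "Billing app", "Order-to-Cash process"}
```

Walking the reversed edges is what turns a static architecture model into an answer to "what breaks if we change this?".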

EA Benefits:

EA has proven benefits and advantages to organisations that include improved decision making, improved adaptability to changing demands or market conditions, elimination of inefficient and redundant processes, optimization of the use of organisational assets, and minimization of employee turnover.

Improve Agility, Increase Productivity, Enhance Transparency, Enforce Culture, Remove Gaps, and Reduce Cost.

In today’s digital and interconnected world, business as usual is not enough anymore!

Organisations should be agile and responsive to the new market requirements, regulation changes, and market trends. Palmira helps you be more agile, responsive, and reduce cost through smart, yet innovative solutions and implementation expertise. Our successful platforms and implementation expertise empowers you with a clear vision for your strategic objectives, keeping your KPIs more aligned towards success and excellence. Our solutions help you grow faster, more agile, and responsive in this ever-changing digital world. It also encourages planning, innovation, and creativity.

Application portfolio management:

The Application Portfolio Management solution allows admin users to unify on-premises and SaaS applications to eliminate redundant systems and improve organisation and efficiency. It provides a console for mapping an organisation’s application landscape, helps users standardise technologies, rationalise use cases, and align service level agreements, and builds an understanding of the application lifecycle and its impact on other artefacts.

The solution integrates data from applications in use to help IT managers identify an application’s value, reduce technology overlap, and mitigate risks. It presents the landscape in a workflow or mapping model, which allows users to group technologies by use or department and visualise their entire company’s software and service stack.
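A toy scoring sketch in the spirit of Gartner's TIME model (Tolerate, Invest, Migrate, Eliminate) referenced earlier: the 1-10 scales and the threshold below are assumptions for illustration, not a prescribed scoring scheme.

```python
def time_category(business_fit, technical_fit, threshold=5):
    """Place an application in a TIME quadrant from two fitness scores."""
    if business_fit >= threshold and technical_fit >= threshold:
        return "Invest"
    if business_fit >= threshold:
        return "Migrate"      # valuable but technically weak: re-platform
    if technical_fit >= threshold:
        return "Tolerate"     # sound technology, limited business value
    return "Eliminate"

# Example portfolio with (business_fit, technical_fit) scores:
portfolio = {
    "Core banking": (9, 8),
    "Legacy HR tool": (7, 2),
    "Old intranet": (2, 1),
}
assert {app: time_category(b, t) for app, (b, t) in portfolio.items()} == {
    "Core banking": "Invest",
    "Legacy HR tool": "Migrate",
    "Old intranet": "Eliminate",
}
```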

Palmira has delivered successful projects throughout the EMEA region, including KSA, UAE, Jordan, and Europe.


ARIS Aware

Powerful insights through live data

Palmira enriches your existing process models with live data from operational systems by using ARIS Aware from Software AG. ARIS Aware connects a wide range of data sources with your process portal. By blending a process model with a running process instance, process cycle times or incidents, you can monitor the differences between the expected and the actual values in real time – and realise significant new process improvements. Blending the design world with the operational world transforms your process portal into a sound foundation for senior executives to make their best decisions. Add business criticality to your process portal with these ARIS Aware capabilities.

Data connectivity and mash-ups

Palmira helps you build dashboards from many data sources using ARIS Aware. ARIS Aware is equipped with ready-to-go templates so you can quickly get answers to critical questions like:

  • How good is the quality of my process model?
  • How often has a model been clicked?
  • What technology is used by a specific group of people?

ARIS Aware offers innovative, context-sensitive, story-telling visualisation of KPIs and data analysis, seamlessly integrated with ARIS.

Context-aware dashboards

ARIS Aware dashboards are easy to build and easy to understand. Show your users the most relevant information that is important to them with context-aware dashboards that any user can easily interpret. Users from any group or business function – such as HR, procurement, marketing, sales, or lines of business – will see the processes they are part of along with KPIs that are relevant to their work. When correlations are easier to understand, users get more value out of the existing process content, which increases user adoption and end-user satisfaction.

Interactive visual analyses

Faster time to value through preconfigured dashboards

Get there faster with preconfigured use case-driven dashboards that can easily be applied to your ARIS Connect portal. Standard dashboards are easy-to-use, customizable and include: Model Quality & Maturity Management, Process Performance Management, Enterprise Architecture Management, and Customer Experience Management.


Easier proof of value through user insights

We enable users to interact in ARIS Connect and indicate which information they need most. Demonstrate the value of your centre of process excellence to the senior executive team with user experience data. And with user insights, you can make your process portal more user-driven and intuitive than ever before.




ARIS “Business Process Analysis”


Software AG’s ARIS Business Modelling Platform is ideal for organisations that want to document, analyse, standardise and improve their processes.

Palmira is named a special innovation partner of Software AG, and is proud to have innovated 12 solutions on top of ARIS. You may visit the ARIS Hub to see more details of those solutions.

With better processes across business, IT and SAP systems, your organisation can respond faster to changing mission and business needs. Use the ARIS Business Process Analysis (BPA) Platform to document and design your strategy, processes and architectures. It’s ideal for analysing, simulating and optimising your business processes with our mapping software.

With ARIS Business Process Analysis, you can leverage the wisdom-of-the-crowd to support continuous process improvements and benefit from governance to keep processes under control. You also can build an effective enterprise management system while keeping customer experience management a priority. All these capabilities are key to your survival in the digital world.



Define, design and document agency-wide architectures

  • Support for all industry standard frameworks (e.g. DoDAF, TOGAF, ArchiMate, etc.) and custom frameworks
  • Align mission and IT strategy with operational processes and IT architecture by combining ARIS Business Process Analysis and Alfabet capabilities
  • Analyse costs to improve process efficiency and cost effectiveness
  • Document business processes and dependencies between organisation, processes, data and IT-applications


  • Identify bottlenecks and run “what if” analyses
  • Provide version control and change history documentation
  • Govern process management to support change management and continuous improvement programs


  • Evaluate process use and quality, and then use KPIs to drive process optimization
  • Share process information with a flexible, role-based process portal
  • Create a common understanding of your business



ARIS offers an easy-to-use tool for enterprise-wide business process design. You can create models quickly and present them so they are easy to understand:

  • Create, analyse, manage and administer your process architecture
  • Document and align your corporate strategy with business processes and IT
  • Describe the resources needed for the processes and environment in which they operate


Analyse process information, such as time and costs, using standard or customised analysis and queries.


Manage process assets in the extensible, method-based and multi-language ARIS repository:

  • Keep track of version control and change history
  • Create a “single truth” for process information
  • Collaborate and report without misunderstandings
  • Make process information a re-usable corporate asset


Share process information with a flexible and customizable role-based process portal:

  • Publish process information quickly, flexibly, reliably
  • Create easy and intuitive views of process models
  • Improve acceptance through business-oriented presentation


Unlock the power of collaborative process improvement:

  • Engage anyone, anywhere, anytime
  • Design, publish and create dashboards all in one tool
  • Link people, processes and IT


Evaluate processes in terms of quality and usage using KPIs:

  • Understand and increase the business value of processes
  • Identify optimization potential using proven analysis
  • Implement best-practice processes and procedures


Analyse business processes to improve process efficiency and cost effectiveness:

  • Identify your best processes
  • Understand and increase the business value of processes
  • Determine best use of resources


Align ARIS process blueprints with execution in webMethods BPMS:

  • Easily collaborate and align processes
  • Implement changes quickly
  • Automate processes successfully


Re-use defined business processes and design test cases graphically or automatically to:

  • Improve process quality in later project phases
  • Save time by re-using existing data and processes
  • Reduce risks of incomplete testing


Manage the process of process management:

  • Implement processes in a lightweight, model-driven way
  • Make changes without IT’s involvement
  • Reduce implementation efforts by 50 percent


A good process is no longer good enough; it’s the customer experience reflected in your processes that really matters. By designing and analysing customer journeys from an outside-in perspective, you get a chance to enhance the customer experience, differentiate from the competition in terms of customer satisfaction, and reflect on your own processes from another point of view. Until recently, however, the Business Process Analysis (BPA) world lacked a CXM solution that delivers the capabilities and methods to design and analyse processes from an “outside-in” perspective.

To close this gap, Palmira utilises ARIS to enable clients to improve their customer experience by designing customer journeys.

By taking customer emotions and expectations into account and using techniques such as customer journey mapping, customer touch-points analysis and identifying the critical Moments of Truth (MoT), your organisation can deliver a better customer experience. Using such approaches to improve customer satisfaction helps you:

  • Ensure better customer interactions, enhance customer satisfaction
  • Stay tuned to customers and recognize new ways to satisfy customer needs
  • Enhance customer loyalty, increase sales and revenue
  • Enhance measures and KPIs, reduce brand risk
  • Preserve business agility
  • Identify gaps and issues, recognize opportunities
  • Take advantage of new innovations

Customer journey mapping is essential for any industry and heavily utilised by the Government, Banking, Oil and Gas, and Telecommunication sectors.


Palmira keeps a simple business process methodology based on APQC. Starting with identifying core, support and management processes, we then define the value chain diagram at the 2nd level. The 3rd level consists of process names for ease of navigation, while at the 4th level we document the process workflow. Capturing events and activities in the process, we link each activity to all related organisational context, including services, roles, forms, and risks. The implementation promotes the agility and responsiveness of the organisation, empowering governance and procedures related to the ecosystem.
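The four-level hierarchy described above can be sketched as nested data, with level-4 activities linked to their organisational context. All names below are illustrative, not taken from a real repository.

```python
hierarchy = {
    "Core": {                                   # level 1: process category
        "Deliver Services": {                   # level 2: value chain
            "Handle Customer Request": [        # level 3: process name
                # level 4: workflow activities with linked context
                {"activity": "Receive request",
                 "role": "Front office", "form": "Request form"},
                {"activity": "Assess risk",
                 "role": "Risk officer", "risk": "Fraud exposure"},
            ],
        },
    },
}

def activities_for_role(tree, role):
    """Find every level-4 activity a given role participates in."""
    found = []
    for chains in tree.values():
        for processes in chains.values():
            for steps in processes.values():
                found += [s["activity"] for s in steps if s.get("role") == role]
    return found

assert activities_for_role(hierarchy, "Risk officer") == ["Assess risk"]
```

Linking each activity to roles, forms and risks is what makes queries like "show everything this role touches" possible across the whole repository.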

Additionally, all our projects go through the 3M (Man, Method, Machine) practice:

Method: Designing the procedures according to global standards.      

Machine: Installing and configuring the solution according to the applied method.

Man: Knowledge transfer to empower human capital to run the solution with the right methodology.

Palmira delivers pioneering Business Process Management services to several reputable names in the Government, Banking and Telecommunication sectors. We have delivered Business Process Management projects that serve great initiatives, from digital government to Open Finance, and from paperless government initiatives to Open Banking, opening the opportunity for businesses to re-engineer their processes and improve their operations and compliance, and helping our clients design their digital future.




Strategy-to-Execute in one solution

Bridge the gap between Business Analyst (Requirements) and IT Developers.


Sync+ enables you to capture business requirements, sync those requirements as use cases and user stories directly with IT developers’ tools (JIRA, EPM,…), give the project manager control to accept or reject changes, integrate with auto-testing tools (e.g. Katalon) that use the same user stories and use cases to drive automated testing, and finally push instant progress reports back to the business team, feeding contribution all the way up to strategic objectives.
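The accept/reject control described here can be sketched as a pending-change queue that pushes only approved changes downstream. The class and field names below are illustrative, not the Sync+ or JIRA API.

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    key: str
    title: str
    synced_title: str = ""      # what the downstream tracker currently shows

class SyncQueue:
    def __init__(self):
        self.pending = []       # (requirement, proposed_title) awaiting review

    def propose(self, req, new_title):
        """Queue a change from the business side instead of applying it."""
        self.pending.append((req, new_title))

    def review(self, accept):
        """PM decision: push accepted changes downstream, drop the rest."""
        for req, title in self.pending:
            if accept(req, title):
                req.synced_title = title    # would call the tracker API here
        self.pending.clear()

req = Requirement("REQ-1", "Customer onboarding",
                  synced_title="Customer onboarding")
q = SyncQueue()
q.propose(req, "Customer onboarding (KYC added)")
q.review(accept=lambda r, t: "KYC" in t)    # PM accepts KYC-related changes
assert req.synced_title == "Customer onboarding (KYC added)"
assert q.pending == []
```

Keeping proposed changes in a queue rather than writing through immediately is what gives the project manager a veto before anything reaches the developers' backlog.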

SyncPlus enables you to:

  • Manage Full Cycle from Strategy to Execute
  • Break down silos between Solution Delivery and the rest of the Business.
  • Communicate the status of releases to business stakeholders and give visibility into your team’s progress.
  • Speed up your SDLC by integrating your clients’ and your business requirements with the most popular project management tools, in one single point of truth using ARIS
  • Reduce error, cost, time, and effort while enhancing productivity, accuracy and reliability, improving the quality of the whole delivery cycle.
  • Seamlessly integrate your business analysis and software development teams.


The solution is dynamic and provides multiple integration points with well-known tools for project management, development and testing (e.g. JIRA, Azure DevOps, Microsoft EPM, Zeplin, Katalon). Prerequisites include:

  • Availability of required ARIS Licences
  • Availability of required ARIS API Licences and Certificates
  • Availability of required JIRA, Azure DevOps, Microsoft EPM API Access



Your digital future starts here, Contact us!

Let's Talk

Our locations:

  • UAE | Dubai | AlMustaqbal Street | Business Bay |
    Exchange Tower | Office # 1703 | P.O.BOX 31712

  • Av. D. João II, Edifício Mythos Lote, 6º Piso,
    Escritório 2, 1990-095 Lisboa, Portugal


    QW93+G4 Lisbon, Portugal

  • Office#403, Al Abraj Almehaneyeh Complex
    Wasfi Al Tal Street, Amman

© Palmira. All rights reserved.