ADVANCED TECHNOLOGY TERM GLOSSARY

Below are quick definitions of common advanced technology terms your business may already be using or considering integrating.

All definitions were sourced from the Gartner IT Glossary and NIST Computer Security Resource Center Glossary.

Adaptive AI

Adaptive AI systems support a decision-making framework centered on making faster decisions while remaining flexible enough to adjust as issues arise. These systems aim to learn continuously from new data at runtime so they can adapt more quickly to changes in real-world circumstances. The AI engineering framework can help orchestrate and optimize applications to adapt to, resist or absorb disruptions, facilitating the management of adaptive systems.
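
For a concrete sense of what "learning from new data at runtime" can look like, here is a minimal sketch using scikit-learn's incremental SGDClassifier. The data and feature layout are invented for illustration; production adaptive AI systems involve far more than a single online model.

    # Minimal sketch of runtime adaptation via incremental (online) learning.
    # The data and feature layout here are illustrative, not a real workload.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    model = SGDClassifier(loss="log_loss")   # logistic loss; supports partial_fit for online updates
    classes = np.array([0, 1])               # all labels must be declared up front

    # Initial fit on whatever history is available at deployment time.
    X_hist = np.random.rand(100, 4)
    y_hist = np.random.randint(0, 2, size=100)
    model.partial_fit(X_hist, y_hist, classes=classes)

    # At runtime, each new labeled batch updates the model in place,
    # so behavior adapts as real-world conditions drift.
    X_new = np.random.rand(10, 4)
    y_new = np.random.randint(0, 2, size=10)
    model.partial_fit(X_new, y_new)
    print(model.predict(X_new[:3]))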

AI Engineering

AI engineering is foundational for enterprise delivery of AI solutions at scale. The discipline unifies DataOps, MLOps and DevOps pipelines to create coherent enterprise development, delivery (hybrid, multicloud, edge), and operational (streaming, batch) AI-based systems.

AI Simulation

AI simulation is the combined application of AI and simulation technologies to jointly develop AI agents and the simulated environments in which they can be trained, tested and sometimes deployed. It includes both the use of AI to make simulations more efficient and useful, and the use of a wide range of simulation models to develop more versatile and adaptive AI systems.
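
As a small illustration of an AI agent interacting with a simulated environment, the sketch below uses the open-source Gymnasium library and its CartPole environment; a random policy stands in for a trained agent, and real AI simulation would train and evaluate the agent across many such rollouts.

    # Minimal sketch: an agent acting in a simulated environment (Gymnasium's CartPole).
    import gymnasium as gym

    env = gym.make("CartPole-v1")
    obs, info = env.reset(seed=42)

    total_reward = 0.0
    for _ in range(200):
        action = env.action_space.sample()   # placeholder for a learned policy
        obs, reward, terminated, truncated, info = env.step(action)
        total_reward += reward
        if terminated or truncated:
            obs, info = env.reset()

    env.close()
    print(f"reward collected by the random policy: {total_reward}")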

Bill of Materials

A bill of materials (BOM) is the data that identifies the items or raw materials used to produce any physical thing, whether that thing is a structure or a product.
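
A BOM is ultimately structured data. The sketch below shows one minimal way it might be represented in code; the part numbers, names and quantities are invented for illustration.

    # Minimal sketch of a bill of materials as structured data.
    from dataclasses import dataclass, field

    @dataclass
    class BomLine:
        part_number: str
        description: str
        quantity: int
        unit: str = "each"

    @dataclass
    class BillOfMaterials:
        product: str
        lines: list[BomLine] = field(default_factory=list)

        def total_items(self) -> int:
            return sum(line.quantity for line in self.lines)

    bom = BillOfMaterials(
        product="Desk lamp",
        lines=[
            BomLine("PN-1001", "Steel base", 1),
            BomLine("PN-1002", "LED module", 1),
            BomLine("PN-1003", "M4 screw", 6),
        ],
    )
    print(bom.total_items())   # 8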

Generative AI

Generative AI refers to AI techniques that learn a representation of artifacts from data, and use it to generate brand-new, unique artifacts that resemble but don’t repeat the original data. These artifacts can serve benign or nefarious purposes. Generative AI can produce totally novel content (including text, images, video, audio, structures), computer code, synthetic data, workflows and models of physical objects. Generative AI also can be used in art, drug discovery or material design.
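
As a toy illustration of the learn-then-generate pattern, the sketch below builds a simple word-transition table from example text and samples new text from it. Real generative models are vastly more capable; this only shows the general shape of the idea, and the corpus is invented.

    # Toy illustration: build a simple statistical representation from example
    # text, then sample a new sequence from it.
    import random
    from collections import defaultdict

    corpus = "the quick brown fox jumps over the lazy dog and the quick cat naps"
    words = corpus.split()

    # "Learn a representation": which word tends to follow which.
    transitions = defaultdict(list)
    for current, following in zip(words, words[1:]):
        transitions[current].append(following)

    # "Generate a new artifact": sample a fresh sequence from that representation.
    random.seed(7)
    word = "the"
    generated = [word]
    for _ in range(8):
        word = random.choice(transitions.get(word, words))
        generated.append(word)
    print(" ".join(generated))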

Consent Management

Consent management is a system, process or set of policies for allowing consumers and patients to determine what health information they are willing to permit their various care providers to access. It enables patients and consumers to affirm their participation in e-health initiatives (patient portal, personal health record or health information exchange) and to establish privacy preferences to determine who will have access to their protected health information (PHI), for what purpose and under what circumstances. Consent management supports the dynamic creation, management and enforcement of consumer, organizational and jurisdictional privacy directives.
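
As a hypothetical sketch of what enforcing a consent directive at access time might look like, consider the following. The field names, purposes and roles are invented; real consent management systems typically follow standards such as the HL7 FHIR Consent resource.

    # Hypothetical sketch of checking a patient's consent directive per request.
    from dataclasses import dataclass

    @dataclass
    class ConsentDirective:
        patient_id: str
        allowed_purposes: set[str]   # e.g., {"treatment", "billing"}
        allowed_roles: set[str]      # e.g., {"primary_care", "pharmacy"}
        share_with_hie: bool         # health information exchange participation

    def may_access_phi(directive: ConsentDirective, requester_role: str, purpose: str) -> bool:
        """Return True only if the request matches the patient's recorded preferences."""
        return requester_role in directive.allowed_roles and purpose in directive.allowed_purposes

    directive = ConsentDirective(
        patient_id="patient-123",
        allowed_purposes={"treatment"},
        allowed_roles={"primary_care"},
        share_with_hie=False,
    )
    print(may_access_phi(directive, "primary_care", "treatment"))   # True
    print(may_access_phi(directive, "billing_dept", "billing"))     # False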

Cybersecurity

Cybersecurity is the combination of people, policies, processes and technologies employed by an enterprise to protect its cyber assets. Cybersecurity is optimized to levels that business leaders define, balancing the resources required with usability/manageability and the amount of risk offset. Subsets of cybersecurity include IT security, IoT security, information security and OT security.

Data Aggregation

Compilation of individual data systems and data that could result in the totality of the information being classified, or classified at a higher level, or of beneficial use to an adversary.

Data Broker

A data broker is a business that aggregates information from a variety of sources; processes it to enrich, cleanse or analyze it; and licenses it to other organizations. Data brokers can also license another company’s data directly, or process another organization’s data to provide it with enhanced results. Data is typically accessed via an application programming interface (API), frequently under subscription-type contracts. Data typically is not “sold” (i.e., its ownership transferred), but rather licensed for particular or limited uses. (A data broker is also sometimes known as an information broker, syndicated data broker or information product company.)
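
As a hypothetical sketch of API-based access to licensed data, the snippet below uses Python's requests library against a made-up endpoint; the URL, parameters and response fields are invented, and real brokers publish their own API specifications and subscription terms.

    # Hypothetical sketch of consuming licensed data from a broker's API.
    import requests

    API_KEY = "your-subscription-key"                    # issued under the licensing contract
    BASE_URL = "https://api.example-databroker.com/v1"   # placeholder URL

    response = requests.get(
        f"{BASE_URL}/companies",
        params={"industry": "manufacturing", "country": "US"},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    response.raise_for_status()
    for record in response.json().get("results", []):
        print(record)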

Data Center

A data center is the department in an enterprise that houses and maintains back-end IT systems and data stores — its mainframes, servers and databases.

Data Processing

The collective set of data actions (i.e., the complete data life cycle, including, but not limited to, collection, retention, logging, generation, transformation, use, disclosure, sharing, transmission, and disposal).

Encryption

Cryptographic transformation of data (called “plaintext”) into a form (called “ciphertext”) that conceals the data’s original meaning to prevent it from being known or used. If the transformation is reversible, the corresponding reversal process is called “decryption,” which is a transformation that restores encrypted data to its original state.
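
A minimal sketch of this plaintext-to-ciphertext round trip, using the symmetric Fernet scheme from the third-party Python cryptography package (pip install cryptography):

    # Encrypt and then decrypt a message with a symmetric key.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # secret key; protect and store it safely
    cipher = Fernet(key)

    plaintext = b"Quarterly payroll file"
    ciphertext = cipher.encrypt(plaintext)    # conceals the original meaning
    recovered = cipher.decrypt(ciphertext)    # decryption restores the original

    print(ciphertext != plaintext)   # True: ciphertext reveals nothing readable
    print(recovered == plaintext)    # True: reversible with the right key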

Enterprise Risk Management

The methods and processes used by an enterprise to manage risks to its mission and to establish the trust necessary for the enterprise to support shared missions. It involves identifying mission dependencies on enterprise capabilities, identifying and prioritizing risks due to defined threats, and implementing countermeasures to provide both a static risk posture and an effective dynamic response to active threats; it also includes assessing enterprise performance against threats and adjusting countermeasures as necessary.

Information Life Cycle Management

Information life cycle management (ILM) is an approach to data and storage management that recognizes that the value of information changes over time and that it must be managed accordingly. ILM seeks to classify data according to its business value and establish policies to migrate and store data on the appropriate storage tier and, ultimately, remove it altogether. ILM has evolved to include upfront initiatives like master data management and compliance.
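
As a hypothetical sketch of an ILM-style policy decision, the snippet below classifies data by age and business value and picks a storage tier or schedules removal; the tier names and thresholds are invented for illustration.

    # Hypothetical ILM tiering decision based on age and business value.
    from datetime import date, timedelta

    def choose_tier(last_accessed: date, business_value: str, today: date | None = None) -> str:
        today = today or date.today()
        age = today - last_accessed
        if business_value == "high" or age < timedelta(days=90):
            return "primary"          # fast, expensive storage
        if age < timedelta(days=365 * 3):
            return "archive"          # cheaper, slower tier
        return "delete"               # retention period exceeded

    print(choose_tier(date(2024, 1, 15), business_value="low", today=date(2024, 2, 1)))  # primary
    print(choose_tier(date(2019, 1, 15), business_value="low", today=date(2024, 2, 1)))  # delete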

Large Language Model

A large language model (LLM) is a specialized type of artificial intelligence (AI) that has been trained on vast amounts of text to understand existing content and generate original content.
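
A minimal sketch of generating text with a small pretrained model via the Hugging Face transformers library (pip install transformers). GPT-2 is used here because it is small and freely downloadable; it is far less capable than the large models the term usually refers to, and the first run downloads model weights.

    # Generate a short continuation from a prompt with a pretrained model.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")
    result = generator("Cybersecurity is important because", max_new_tokens=30)
    print(result[0]["generated_text"])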

Machine Learning

Advanced machine learning algorithms combine many technologies (such as deep learning, neural networks and natural language processing), are used in both supervised and unsupervised learning, and operate guided by lessons drawn from existing information.
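
A minimal sketch contrasting supervised and unsupervised learning with scikit-learn's built-in iris dataset (pip install scikit-learn):

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.cluster import KMeans

    X, y = load_iris(return_X_y=True)

    # Supervised: learn from labeled examples, then predict labels for new data.
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    print("supervised accuracy on training data:", clf.score(X, y))

    # Unsupervised: find structure (clusters) without using the labels at all.
    clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    print("cluster assignments for the first five samples:", clusters[:5])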

Open Source

Open source describes software that comes with permission to use, copy and distribute it, either as is or with modifications, and that may be offered either free of charge or for a fee. The source code must be made available.

Operational Technology

Operational technology (OT) is hardware and software that detects or causes a change, through the direct monitoring and/or control of industrial equipment, assets, processes and events.

Penetration Testing

A test methodology in which assessors, typically working under specific constraints, attempt to circumvent or defeat the security features of a system.

Quantum Computing

Quantum computing is a type of nonclassical computing that operates on the quantum state of subatomic particles. The particles represent information as elements denoted as quantum bits (qubits). A qubit can represent all possible values simultaneously (superposition) until read. Qubits can be linked with other qubits, a property known as entanglement. Quantum algorithms manipulate linked qubits in their undetermined, entangled state, a process that can address problems with vast combinatorial complexity.
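
The toy NumPy calculation below illustrates superposition and entanglement numerically with plain state vectors; it is not a real quantum device or framework, just the linear algebra behind the two ideas.

    import numpy as np

    # A single qubit starts in |0>; a Hadamard gate puts it into equal superposition.
    ket0 = np.array([1.0, 0.0])
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    superposed = H @ ket0
    print("measurement probabilities for |0> and |1>:", np.abs(superposed) ** 2)  # [0.5 0.5]

    # Two qubits: Hadamard on the first, then CNOT, yields the entangled Bell state
    # (|00> + |11>) / sqrt(2); the qubits' measurement outcomes are perfectly correlated.
    two_qubits = np.kron(superposed, ket0)   # state after H on qubit 0
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])
    bell = CNOT @ two_qubits
    print("probabilities for |00>, |01>, |10>, |11>:", np.abs(bell) ** 2)  # [0.5 0 0 0.5]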

Remote Access

Access to an organizational information system by a user (or an information system) communicating through an external, non-organization-controlled network (e.g., the Internet).

Zero Day Attack

An attack that exploits a previously unknown hardware, firmware, or software vulnerability.

Zero Trust

A collection of concepts and ideas designed to minimize uncertainty in enforcing accurate, least privilege per-request access decisions in information systems and services in the face of a network viewed as compromised.
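
As a hypothetical sketch of a per-request, least-privilege access decision, consider the following; the attributes and policy are invented for illustration, and nothing is trusted simply for being "inside" the network.

    # Hypothetical zero-trust style policy check evaluated on every request.
    from dataclasses import dataclass

    @dataclass
    class AccessRequest:
        user_id: str
        mfa_verified: bool
        device_compliant: bool
        resource_sensitivity: str      # "low", "medium", or "high"
        requested_action: str          # e.g., "read", "write"

    def decide(request: AccessRequest) -> bool:
        """Grant the least privilege needed, only when every signal checks out."""
        if not (request.mfa_verified and request.device_compliant):
            return False
        if request.resource_sensitivity == "high" and request.requested_action != "read":
            return False               # writes to sensitive data need a separate workflow
        return True

    print(decide(AccessRequest("alice", True, True, "high", "read")))    # True
    print(decide(AccessRequest("alice", True, False, "low", "read")))    # False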