IT Managed Services Provider Resource Recommendation Update on February 10, 2021

Terms to know

What is Multi-tenancy?

Multi-tenancy is an architecture in which a single instance of a software application serves multiple customers. Each customer is called a tenant. Tenants may be given the ability to customize some parts of the application, such as color of the user interface (UI) or business rules, but they cannot customize the application’s code.

Multi-tenancy can be economical because software development and maintenance costs are shared. It can be contrasted with single-tenancy, an architecture in which each customer has their own software instance and may be given access to code. With a multi-tenancy architecture, the provider only needs to make updates once. With a single-tenancy architecture, the provider must touch multiple instances of the software in order to make updates.
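At the data layer, multi-tenancy is often implemented by tagging every record with a tenant identifier and scoping every query by it, so tenants share one schema and one database instance but never see each other's rows. A minimal sketch in Python (the invoices table, column names and tenant names are purely illustrative):

```python
# Row-level multi-tenancy sketch: one shared table, every row tagged
# with a tenant_id, and all reads scoped to that tenant.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (tenant_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?)",
    [("acme", 100.0), ("acme", 250.0), ("globex", 75.0)],
)

def invoices_for(tenant_id):
    # Scoping every query by tenant_id keeps each tenant's data isolated
    # even though all tenants share the same instance and schema.
    rows = conn.execute(
        "SELECT amount FROM invoices WHERE tenant_id = ?", (tenant_id,)
    ).fetchall()
    return [amount for (amount,) in rows]

print(invoices_for("acme"))    # [100.0, 250.0]
print(invoices_for("globex"))  # [75.0]
```

This is also why the provider only updates once: a schema change or bug fix to the shared instance reaches every tenant simultaneously.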

Multi-tenancy in the cloud

In cloud computing, the meaning of multi-tenancy architecture has broadened because of new service models that take advantage of virtualization, containerization and remote access.

Public cloud providers rely on multi-tenant architectures to accommodate more users at the same time. Customers’ workloads are abstracted from the hardware and underlying software, allowing multiple users to reside on the same host.

“Kubernetes multi-tenancy adds multilayered complexity to an already complex Kubernetes security picture, and demands that IT pros wire together a stack of third-party and, at times, homegrown tools on top of the core Kubernetes framework.” – Beth Pariseau

Related Terms: multi-tenant cloud, Google Kubernetes, noisy neighbor, containers as a service (CaaS), cloud management, hyper-hybrid cloud, single tenancy

What is Internet Privacy?

Internet privacy refers to the vast range of technologies, protocols and concepts related to giving individual users or other parties more privacy protections in their use of the global Internet.

Internet privacy takes many forms, including mandatory privacy statements on websites, data sharing controls, data transparency initiatives and more.

Internet privacy and anonymity are paramount to users, especially as e-commerce continues to gain traction. Privacy violations and threat risks are standard considerations for any website under development.

Internet privacy is also known as online privacy.

What is a Computer System?

A computer system is a basic, complete and functional hardware and software setup with everything needed to perform computing work.

That’s the basic working definition of the computer system as we know it, but it has gone through a lot of formal changes over the past few decades.

What is Island hopping?

Island hopping, also called leapfrogging or pivoting, is a cybersecurity exploit in which an attacker gains access to an intended target by initially exploiting the employees and supply chain partners who have access to the target’s network.

In this type of lateral attack, the threat actor exploits a weakness downstream from the actual target and uses it as a launching point to reach the intended target. The term island hopping is inspired by a military strategy used in the Pacific theater during World War II.

Generally, island hopping attackers pick employees, customers and smaller companies that work with the target organization, hoping that their cyberdefense will be weaker than the ultimate target. Island hopping attacks often begin through phishing exploits in which the attacker disguises themselves as a reputable entity in an email or other communication channel.

“Lateral movement used to mean the attacker saying, ‘I landed on one endpoint on the network, and I want to get to the next one.’ Now, the attacker wants to leverage that entire connected ecosystem, and then they can do lots and lots of nefarious things.” – Rick McElroy

Related Terms You Should Know: supply chain security, supply chain risk management, service supply chain, business email compromise (BEC), vendor risk management

What is an Artificial Neural Network (ANN)?

An artificial neural network (ANN) is a computational model based on the structure and functions of biological neural networks. Information that flows through the network affects the structure of the ANN because a neural network changes – or learns, in a sense – based on that input and output.

ANNs are considered nonlinear statistical data modeling tools in which complex relationships between inputs and outputs are modeled or patterns are found.

ANNs are deep learning models capable of pattern recognition and machine learning. They are part of the broader artificial intelligence (AI) technology field.

ANN is also known as a neural network.
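The "changes based on input and output" idea can be made concrete with a toy network: a single sigmoid neuron trained by gradient descent to reproduce the logical OR function. This is a minimal sketch in plain Python; the learning rate, epoch count and squared-error loss are arbitrary illustrative choices:

```python
# A single sigmoid neuron "learning" the OR function by gradient descent.
import math

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w1, w2, b = 0.0, 0.0, 0.0   # the network's adjustable structure
lr = 0.5                     # learning rate (arbitrary choice)

def predict(x1, x2):
    return 1 / (1 + math.exp(-(w1 * x1 + w2 * x2 + b)))

for _ in range(2000):
    for (x1, x2), target in data:
        out = predict(x1, x2)
        grad = (out - target) * out * (1 - out)  # d(squared error)/d(pre-activation)
        # The weights change based on each input/output pair -- this is
        # the "learning" the definition above describes.
        w1 -= lr * grad * x1
        w2 -= lr * grad * x2
        b  -= lr * grad

print(round(predict(0, 0)), round(predict(1, 0)))  # 0 1
```

After training, the weights have shifted so the neuron outputs near 0 for (0, 0) and near 1 for every other input; real ANNs do the same thing with many layers of such units.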

What is Software Development Life Cycle (SDLC)?

A software development life cycle (SDLC) model is a conceptual framework describing all activities in a software development project from planning to maintenance. This process is associated with several models, each including a variety of tasks and activities.

Software development is a demanding activity requiring proper identification of requirements, their implementation and software deployment. However, the activities do not end there: after the software is distributed, proper maintenance has to be provided in a timely manner.

This term is also known as the software development process model.

Here are some things that are going to be central to our explanation of the software development life cycle:

  • SDLC encompasses: planning, implementation, testing, documentation, deployment and maintenance.
  • Models shifted from traditional staged SDLC processes to agile, and then to DevOps.
  • Agile and DevOps as practices merged traditional staging in new and interesting ways.
  • The cloud brought web-delivered resources into the picture.
  • Although SDLC is now much changed, the concept remains largely the same.

What is Cloud application performance management (cloud APM)?

Cloud application performance management (cloud APM) is the process of monitoring resources that support software application performance in public cloud, private cloud and hybrid cloud environments, and ultimately taking actions to resolve issues and maintain optimal performance.

The major goals of cloud APM are the same as traditional APM: help administrators quickly identify and resolve any issues with a cloud-based application that impact either the user experience (UX) or back-end functions such as security and costs.

The term APM is often used synonymously with a subcategory of management: application performance monitoring, which generally focuses on monitoring the metrics that underpin application performance and usability. Note that APM tools have begun to evolve beyond basic monitoring and toward remediation, but true app management functionality is still mostly nascent, given the rapid growth of applications, their complexity and the number of teams and technologies involved in developing and maintaining them.

In the context of cloud APM, issues are typically not remediated through the APM tool itself. The resolution process could involve on-premises adjustments for private cloud workloads, as well as tweaking the cloud services and functions upon which the application depends. It may also include turning off the cloud service until the issue has been resolved.

By either interpretation of APM, the first step to identify and fix application performance problems is to know what’s happening. Software agents placed on the application server monitor application, service and database response times.

Administrators can use cloud APM tools to combine data from disparate monitoring silos into a correlation engine and dashboard, which makes audit logs easier to read and saves IT staff from memory-dependent and error-prone manual correlation and analysis. APM tools also can display graphical representation of how an application behaves on end-user devices — including an index-based graph to measure end user satisfaction or “happiness” — and gauge how service-based events affect these ratings.
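The index-based "happiness" graph mentioned above is commonly an Apdex-style score: response times under a threshold T count as satisfied, times up to 4T count as tolerating at half weight, and anything slower counts as frustrated. A minimal sketch of an agent recording response times plus that index (the 0.5-second threshold is an illustrative choice, not a mandated value):

```python
# Response-time monitoring plus an Apdex-style satisfaction index.
import time
from functools import wraps

response_times = []

def monitored(fn):
    """Record the wall-clock duration of every call, like an APM agent would."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            response_times.append(time.perf_counter() - start)
    return wrapper

def apdex(times, threshold=0.5):
    # Apdex: satisfied (<= T) score 1, tolerating (<= 4T) score 0.5,
    # frustrated (> 4T) score 0; the index is the average score.
    satisfied = sum(1 for t in times if t <= threshold)
    tolerating = sum(1 for t in times if threshold < t <= 4 * threshold)
    return (satisfied + tolerating / 2) / len(times)

@monitored
def handle_request():
    time.sleep(0.01)   # stand-in for real application work
    return "ok"

for _ in range(5):
    handle_request()

print(f"apdex: {apdex(response_times):.2f}")   # all calls well under 0.5s
```

Real agents forward these samples to a correlation engine rather than keeping them in a local list, but the measurement principle is the same.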

“Moving to a cloud architecture presents new challenges in data center performance monitoring and management, including how to support application performance.” – Alan R. Earls

Related Terms: application server, application monitoring, real-time monitoring, correlation engine, audit log, cloud service level agreement

What is Secure Access Service Edge (SASE)?

Secure Access Service Edge, also known as SASE — pronounced “sassy” — is a cloud architecture model that bundles network and security-as-a-service functions together and delivers them as a single cloud service.

SASE allows organizations to unify their network and security tools in a single management console. This provides a simple security and networking tool that is independent of where employees and resources are located. SASE requires little to no hardware, using the widespread connectivity of cloud technology to combine SD-WAN with network security functions, including:

  • firewall as a service (FWaaS)
  • secure web gateways
  • cloud access security brokers (CASBs)
  • zero-trust network access

With the number of remote workers growing, and organizations increasingly using cloud services to run applications, SASE offers a convenient, agile, cost-effective and scalable SaaS product for networking and security.

Organizations looking for a more advanced, user-centric approach to network management would benefit from learning about SASE architectures. Driven by the adoption of cloud services, mobile workforces and edge networks, digital and cloud transformation is changing the way organizations consume network security. In the past, organizations consumed security through legacy hardware networks and an outdated security architecture mindset.

“The SASE model is especially appealing to organizations that abundantly use the cloud and cloud services or are on a path to the cloud. This includes distributed organizations — for example, those with branch locations and dispersed end users, as well as businesses with IoT and edge deployments.” — Mike Chapple

Related Terms: next-gen firewall, software as a service, gateway, cloud access security broker, zero-trust network access, edge computing

What is Web Development?

Web development refers in general to the tasks associated with developing websites for hosting via intranet or internet. The web development process includes web design, web content development, client-side/server-side scripting and network security configuration, among other tasks.

In a broader sense, web development encompasses all the actions, updates, and operations required to build, maintain and manage a website to ensure its performance, user experience, and speed are optimal.

It might also, but does not necessarily, include the strategic actions needed to ensure a site’s proper ranking on search engine results pages. Usually, those tasks pertain to a different specialization, namely search engine optimization (SEO).

Web development is also known as website development, while the professionals who build and maintain websites are called web developers or (more commonly) web devs.

What is Online Marketing?

Online marketing is a set of tools and methodologies used for promoting products and services through the internet. Online marketing includes a wider range of marketing elements than traditional business marketing due to the extra channels and marketing mechanisms available on the internet.

Online marketing can deliver benefits such as:

  • Growth potential
  • Reduced expenses
  • Elegant communications
  • Better control
  • Improved customer service
  • Competitive advantage

Online marketing is also known as internet marketing, web marketing, or digital marketing. It includes several branches such as social media marketing (SMM), search engine optimization (SEO), pay-per-click advertising (PPC), and search engine marketing (SEM).

What is the Dark Web?

The dark web, also referred to as the darknet, is an encrypted portion of the internet that is not indexed by search engines and requires special software and/or authorization to access the content.

Although the dark web is sometimes portrayed as a domain frequented by criminal elements, it was originally created for people who required privacy for legal reasons, such as exchanging intellectual property and other proprietary business information.

Sometimes the terms “dark web” and “deep web” are used interchangeably, but the two are not synonyms. Neither dark nor deep web pages can be found on search engine results pages (SERPs), but deep web pages can be accessed by anyone with a browser who knows the URL.

Dark web page access, in contrast, requires a special browser, such as the Tor browser, as well as knowledge of where to find the content.

“The dark web primarily refers to sites and content that reside on overlay networks rather than the traditional internet; these sites, often referred to as “hidden services,” require specialized web browsers to access.” – Alexander Culafi

Related Terms: deep web, TOR browser, overlay network, HORNET, going dark, encryption, data anonymization, anonymous web surfing

What is a Data Engineer?

A data engineer is a worker whose primary job responsibilities involve preparing data for analytical or operational uses. The specific tasks handled by data engineers can vary from organization to organization but typically include building data pipelines to pull together information from different source systems; integrating, consolidating and cleansing data; and structuring it for use in individual analytics applications.

The data engineer often works as part of an analytics team, providing data in a ready-to-use form to data scientists who are looking to run queries and algorithms against the information for predictive analytics, machine learning and data mining purposes. In many cases, data engineers also work with business units and departments to deliver data aggregations to executives, business analysts and other end users for more basic types of analysis to aid in ongoing operations.

Data engineers commonly deal with both structured and unstructured data sets — as a result, they must be versed in different approaches to data architecture and applications. A variety of big data technologies, including an ever-growing assortment of open source data ingestion and processing frameworks, are also part of the data engineer’s tool kit.

To carry out their duties, data engineers can be expected to have skills in such programming languages as C#, Java, Python, Ruby, Scala and SQL. They also need a good understanding of extract, transform and load tools and REST-oriented APIs for creating and managing data integration jobs, and providing data analysts and business users with simplified access to prepared data sets.
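The extract, transform and load pattern at the heart of a data engineer's pipelines can be sketched in a few lines. This is a toy illustration only; the field names, cleansing rule and per-department aggregation are invented for the example:

```python
# Minimal extract-transform-load (ETL) sketch: pull rows from a "source",
# cleanse and consolidate them, then load them in a ready-to-use form.
raw_rows = [
    {"name": " Alice ", "dept": "sales", "revenue": "1200"},
    {"name": "Bob",     "dept": "sales", "revenue": "800"},
    {"name": "",        "dept": "ops",   "revenue": "500"},   # incomplete record
]

def extract():
    # In practice this would read from source systems: APIs, databases, files.
    return raw_rows

def transform(rows):
    cleaned = []
    for row in rows:
        name = row["name"].strip()
        if not name:                      # cleansing: drop incomplete records
            continue
        cleaned.append({"name": name, "dept": row["dept"],
                        "revenue": float(row["revenue"])})   # type normalization
    return cleaned

def load(rows):
    # Aggregate revenue per department -- the ready-to-use form that
    # analysts and executives consume downstream.
    totals = {}
    for row in rows:
        totals[row["dept"]] = totals.get(row["dept"], 0.0) + row["revenue"]
    return totals

print(load(transform(extract())))  # {'sales': 2000.0}
```

Production pipelines swap the in-memory lists for ingestion frameworks and warehouses, but the extract/transform/load split stays the same.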

“Over 80% of the time spent on data science projects is often devoted to wrangling and preparing data for analysis. While most of the data preparation tasks fall on data engineers, data scientists can benefit from being able to do basic data profiling, cleansing and modeling tasks.” – Kathleen Walch

Related Terms: data scientist, citizen data scientist, data janitor, data preprocessing, data stewardship, data artist

Visual Paradigm Online is a network diagram tool with support for UML, Org Chart, Floor Plan, wireframe, family tree, ERD, and more. Features a simple, intuitive diagram editor and the ability to work collaboratively with your team.

A trick for rack mounting a heavy switch: “Screw a rack screw into the hole right below the one the device is going in, and rest the ears on those. The device should stay put while you lift one side up at a time and screw in the bottom screw.”

RUPS (Reading and Updating PDF Syntax) enables you to look inside a PDF document to see all the PDF objects and content streams. This tool is built atop iText.

Trello is a simple, intuitive app for organizing all your task lists and to-dos.

Time-savers tips:

  • What is that IP address? On Windows, ping -a 192.168.xx.xx resolves the address to its hostname (a reverse DNS lookup).
  • Press Ctrl-L in any browser or explorer window to highlight the address bar.

NetzTools is a secure, lightweight multitasking network app. It contains the following tools: show IP interface, ping, ping6, secure shell, telnet, port scan, traceroute, LAN scan, OUI lookup, and name lookup.

Invoke-GPOZaurr is a cmdlet found in the GPOZaurr PowerShell module that gives you access to a nice assortment of useful Group Policy reports. It’s billed as “a tool to eat your Group Policies and tell you what’s wrong with them or give you data for further analysis with zero effort on your side.”

Threatpost provides the latest cybersecurity information for an audience of IT pros. Includes security news, videos, original feature reports, expert commentary and reader discussion on high-priority news.

CADE is a 2D vector editor that’s ideal for creating detailed network diagrams, flowcharts, schemas, maps, and more with an intuitive GUI. Its Visio-style functions allow you to drag-n-drop and connect predefined blocks, shapes, and both raster and vector images. Blocks/attribute collections can be modified and extended.

TFC Temp File Cleaner cleans out the folders that house temporary files for Java and Windows and the IE, Opera, Chrome, and Safari caches. It cleans the folders for all accounts on the computer, including Admin, NetworkService, and LocalService.