IT Managed Services Provider Resource Recommendation Update on April 15, 2021

Terms to know

What is Tech ethicist?

A tech ethicist is a corporate role that involves examining a company’s technologies to ensure they meet ethical standards, for example, that they do not exploit user vulnerabilities or infringe upon user rights. The term also refers to independent experts who do similar work outside a single company.

Although there is no standard education stream for tech ethicists yet, to fill that role an individual would need grounding in not only ethics and technology but also psychology, law and sociology, among other things. Tech ethicist David Polgar likes to compare the tasks of engineers and ethicists: Engineers see a problem and find a solution, after which the ethicist sees the solution and looks for problems.

Technology ethics is an increasingly important area of focus as the sophistication and capacities of technologies have advanced far ahead of concerns for security, privacy and the well-being of users. The humane tech movement seeks to change that focus to realign technology with humanity. As that movement develops, the demand for tech ethicists is likely to grow.

“Data breaches and privacy scandals have made it vital for organizations to prioritize tech ethics and consider the effects IoT devices and AI will have on individuals and society.” – Jessica Groopman

Related Terms: data protection officer (DPO), AI code of ethics, responsible AI, California Consumer Privacy Act, General Data Protection Regulation

What is First Person Shooter (FPS)?

A first person shooter (FPS) is a genre of action video game that is played from the point of view of the protagonist. FPS games typically map the gamer’s movements and provide a view of what an actual person would see and do in the game.

An FPS usually shows the protagonist’s arms at the bottom of the screen, carrying whatever weapon is equipped. The gamer propels the avatar through the game by moving it forward, backward, sideways and so on using the game controller. Forward movements of the controller move the avatar forward through the scenery, usually with a slight left-right rocking motion to simulate the human gait. To increase the level of realism, many games include the sounds of breathing and footsteps in addition to the regular sound effects.

FPS games can be played in two general modes: mission (or quest) mode and multiplayer mode. Mission mode is usually the default for a single player and typically involves battling through progressively harder game levels toward some ultimate goal. Multiplayer mode involves multiple gamers participating via a network and playing in a shared game environment. It can take many forms, including:

  • Deathmatches
  • Capture the flag
  • Team deathmatch
  • Search and destroy
  • Base (a.k.a. assault or headquarters)
  • Last man standing

First person shooter primarily refers to the perspective of the game. Other genres, including racing and boxing games, occasionally use the first person perspective as well. Shooters, that is, games in which various weapons (but mostly guns) are used to kill opponents, have been made using all the basic gaming perspectives: first person, third person, side scrolling, top-down and 3/4.

The first FPS was “Maze War”, developed in 1973. However, it was the 1992 “Wolfenstein 3D” game that really entrenched the concept. Some of the most influential FPS games include “Doom”, “Quake”, “Half-Life” and “Counter-Strike”, all of which gained dedicated followers.

There is a wide variety of FPS games on the market. Many can be played on different platforms, including PCs, gaming consoles and handheld devices.

What is Symmetric Encryption?

Symmetric encryption is a form of computerized cryptography that uses a single encryption key to disguise an electronic message. The data is converted with a mathematical algorithm and a secret key, rendering the message unintelligible to anyone who does not hold that key. Symmetric encryption is a two-way algorithm: the mathematical algorithm is reversed when decrypting the message, using the same secret key.

Symmetric encryption is also known as private-key encryption and secure-key encryption.

The two types of symmetric encryption use block and stream algorithms. Block algorithms are applied to blocks of electronic data: set lengths of bits are transformed at once using the selected secret key, and the key is applied to each block in turn. When network stream data is being encrypted, however, the encryption system must hold the data in memory while it waits for complete blocks to arrive. That waiting time can create a security gap and may compromise data protection. The usual solution is to combine the partial block with the contents of previously encrypted blocks until the remaining data arrives, a process known as feedback; once a complete block has been assembled, it is encrypted.

Stream algorithms, by contrast, encrypt data as it arrives rather than holding it in the encryption system’s memory. This approach is considered somewhat more secure, since no disk or system component retains unencrypted data while waiting for it to be processed.
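
As a minimal sketch of the single-key property, assuming the third-party Python cryptography package is installed, the Fernet recipe below encrypts and then decrypts a message with one shared secret key:

    from cryptography.fernet import Fernet

    # One secret key handles both directions of the transformation.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    ciphertext = cipher.encrypt(b"transfer $500 to account 1234")
    plaintext = cipher.decrypt(ciphertext)

    assert plaintext == b"transfer $500 to account 1234"
    # Anyone holding the same key can reverse the encryption;
    # anyone without it sees only unintelligible bytes.

Fernet uses AES in CBC mode internally, so it is an example of the block-algorithm case described above.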

What is Microlearning (Microtraining)?

Microlearning is an educational strategy that breaks complex topics down into short-form, stand-alone units of study that can be viewed as many times as necessary, whenever and wherever the learner has the need. Microlearning instructional modules are designed to be consumed in about five minutes and address one specific skill or knowledge gap topic.

The convenience of microlearning, from both the learner’s and the educator’s points of view, has made this type of instructional delivery popular in corporate learning environments. Scientific research suggests that a self-directed, modular approach to talent pipeline development improves knowledge retention. It also empowers employees by giving them the opportunity to build new skills directly in the context of the job they are being paid to do, without having to take time away from that job to attend training.

Although microlearning is most often associated with independent learning, modules can also be strung together to create guided learning experiences for individuals or small groups. The small chunks of instructional content can be tagged with metadata for easy search, access and reuse.

“Microlearning modules are most often accessed as the need for knowledge arises, but they can also be assigned as part of an employee’s monthly or quarterly goals.” – Wesley Chai

Related Terms: talent pipeline, learning experience platform (LXP), knowledge base, learning management system (LMS), chief learning officer (CLO)

What is Sandbox?

A sandbox is a style of game in which minimal character limitations are placed on the gamer, allowing the gamer to roam and change a virtual world at will. In contrast to a progression-style game, a sandbox game emphasizes roaming and allows a gamer to select tasks. Instead of featuring segmented areas or numbered levels, a sandbox game usually occurs in a “world” to which the gamer has full access from start to finish.

A sandbox game is also known as an open-world or free-roaming game.

Sandbox games can include structured elements – such as mini-games, tasks, side missions and storylines – that gamers may simply ignore. In fact, the sandbox game’s nonlinear nature creates storyline challenges for game designers. For this reason, tasks and side missions usually follow a progression, where new tasks are unlocked upon successful completion of earlier ones.

Sandbox game types vary. Massive multiplayer online role-playing games (MMORPG) generally include a mixture of sandbox and progression gaming and heavily depend on emergent interactive user gameplay for retaining non-progression-focused gamers. Modern “beat ’em ups” and first-person shooters have delved more deeply into the sandbox realm with titles like the “Grand Theft Auto” series, “Red Dead Redemption,” “Assassin’s Creed” and others, allowing gamers to run and gun wherever the mood takes them.

In spite of their name, various sandbox games continue to impose restrictions at some stages of the game environment. These can be due to the game’s design limitations, or they can be short-run, in-game limitations, such as locked areas that open once certain milestones are achieved.

What is Massively Multiplayer Online Role-Playing Game (MMORPG)?

A massively multiplayer online role-playing game (MMORPG) is a video game that takes place in a persistent state world (PSW) with thousands, or even millions, of players developing their characters in a role-playing environment. The virtual world in which the game takes place is never static. Even when a player is logged off, events are occurring across the world that may impact the player when he or she logs in again.

Unlike traditional console-based role-playing games where the overarching goal is to complete the game, MMORPGs depend on emergent game play based on the interactions of players and groups of players. Most MMORPGs still provide tasks and battles that get progressively harder, but the primary purpose of these is to help gamers build up their characters in terms of experience, abilities and wealth.

To keep the gaming experience evolving, MMORPGs allow players to form alliances, interact within the game, customize their avatars and even create some of the game content. Moreover, players who are not interested in entering dungeons and battles to build up their characters can still participate in the game by setting up shops in villages and cities to contribute to the authenticity of the game’s world.

MMORPGs also have their own economies, where players can use the virtual currency they’ve earned in battles to buy items. This virtual economy has crossed over into the real world in some areas. For example, MMORPG players have exchanged real currency for items and virtual currency. In some instances, players seeking to level up their characters more quickly have employed farmers – gamers who play as another person’s character and earn experience points for the employer while he or she is logged off.

What is Hybrid Workforce?

A hybrid workforce is a type of blended workforce comprising employees who work remotely and those who work from an office or central location. This way, employees can work from the places they want to, either in a central location, such as a warehouse, factory or retail location, or in a remote location, such as the home.

However, a hybrid workforce isn’t just about working from home or working from the office; rather, it’s about helping employees achieve a flexible work-life balance.

Hybrid workforces enable employees to work in a setting that’s most comfortable for them. If workers feel they are more productive in one location versus another, they can choose to work in that environment — or work in a combination of the two.

The hybrid workplace model has also put the health, safety and psychological needs of workers first by allowing for social distancing during the COVID-19 pandemic. A survey from Enterprise Technology Research (ETR) expected the number of employees working from home to double in 2021 and to remain elevated in the eventual post-pandemic landscape. Additionally, at its Directions 2021 virtual conference, IDC predicted that employees will have a choice about adopting a hybrid workplace model, with about 33% of employees still having to work on site full time.

“Hybrid workplace models blend remote work with in-office work. Instead of structuring work around desks in a physical office space, hybrid work generally enables employees to structure work around their lives.” – Linda Rosencrance

Related Terms: human capital management (HCM), diversity, equity and inclusion (DEI), team collaboration tools, contingent workforce, workforce management

What is gig economy?

A gig economy is a free market system in which temporary positions are common and organizations hire independent workers for short-term commitments. The term “gig” is slang for a job that lasts a specified period of time; it was traditionally used by musicians. Examples of gig work arrangements include freelancers, independent contractors, project-based workers and temporary or part-time hires.

There has been a trend toward a gig economy in recent years. There are a number of forces behind the rise in short-term jobs. For one, the workforce is becoming more mobile and work can increasingly be done remotely via digital platforms. As a result, job and location are being decoupled. That means that freelancers can select among temporary jobs and projects around the world, while employers can select the best individuals for specific projects from a larger pool than what’s available in any given area.

Digitization has also contributed directly to a decrease in jobs as software replaces some types of work to maximize time efficiency. Other influences include financial pressures on businesses leading to a flexible workforce and the entrance of the millennial generation into the labor market. People tend to change jobs several times throughout their working lives, especially millennials, and the gig economy can be seen as an evolution of that trend.

The gig economy is part of a shifting cultural and business environment that also includes the sharing economy, the gift economy and the barter economy. Its cultural impact continues to change; the COVID-19 pandemic that began in 2020, for example, has had a large influence on it.

“In a gig economy, businesses save resources in terms of benefits, office space and training. They also have the ability to contract with experts for specific projects who might be too high-priced to maintain on staff.” – Alexander S. Gillis

Related Terms: hybrid workforce, contingent workforce, ghost worker, fractional CIO, workforce planning

What is Video Conferencing?

Video conferencing is a live, visual connection between two or more remote parties over the internet that simulates a face-to-face meeting. Video conferencing is important because it joins people who would not normally be able to form a face-to-face connection.

At its simplest, video conferencing provides transmission of static images and text between two locations. At its most sophisticated, it provides transmission of full-motion video images and high-quality audio between multiple locations.

In the business world, desktop video conferencing is a core component of unified communications (UC) applications and web conferencing services, while cloud-based virtual meeting room services enable organizations to deploy video conferencing with minimal infrastructure investment.

The video conferencing process can be split into two steps: compression and transfer.

During compression, the webcam and microphone capture analog audiovisual (AV) input. The data collected is in the form of continuous waves of frequencies and amplitudes. These represent the captured sounds, colors, brightness, depth and shades. In order for this data to be transferred over a normal network — instead of requiring a network with massive bandwidth — codecs must be used to compress the data into digital packets. This enables the captured AV input to travel faster over broadband or Wi-Fi internet.

During the transfer phase, the digitally compressed data is sent over the digital network to the receiving computer. Once it reaches the endpoint, the codecs decompress the data and convert it back into analog audio and video, enabling the receiving screen and speakers to display and play the AV data correctly.
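
As a rough sketch of that two-step pipeline in Python, with zlib standing in for a real AV codec and the network transfer omitted:

    import zlib

    # Stand-in for a captured audiovisual frame (normally raw pixels and samples).
    captured_frame = b"frame-0001: pixel rows and audio samples ..." * 100

    # Compression step: the "codec" shrinks the data before it leaves the sender.
    packet = zlib.compress(captured_frame)

    # Transfer step: the packet crosses the network (omitted here), then the
    # matching codec on the receiving endpoint decompresses it for playback.
    received_frame = zlib.decompress(packet)

    assert received_frame == captured_frame
    print(len(captured_frame), "bytes reduced to", len(packet), "for transfer")

Real codecs such as H.264 or Opus are lossy and far more elaborate, but the capture, compress, transfer, decompress shape of the process is the same.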

“When the pandemic hit and everyone shifted to video, the major concern was simply connecting so that business could continue. But after almost a year of seeing ourselves looking like ghouls with bad lighting, or just occupying that same boring video box, people are looking to jazz things up.” – David Maldow

Related Terms: Unified Communications, compression, codec, virtual meeting room, latency

What is Hybrid Encryption?

Hybrid encryption is a mode of encryption that merges two or more encryption systems. It combines asymmetric and symmetric encryption to benefit from the strengths of each: the security of asymmetric encryption and the speed of symmetric encryption.

Hybrid encryption is considered a highly secure type of encryption as long as the public and private keys are fully secure.

A hybrid encryption scheme is one that blends the convenience of an asymmetric encryption scheme with the effectiveness of a symmetric encryption scheme.

Hybrid encryption is achieved by transferring data with a unique session key in combination with symmetric encryption. Public key (asymmetric) encryption is used to encrypt a randomly generated symmetric session key. The recipient uses the matching private key to recover the session key, which is then used to decrypt the message.
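
A minimal sketch of that exchange, assuming the third-party Python cryptography package (the message text and key sizes are illustrative only):

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Recipient's asymmetric key pair (the public key can be shared openly).
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    # Sender: encrypt the bulk message with a fresh symmetric session key ...
    session_key = Fernet.generate_key()
    ciphertext = Fernet(session_key).encrypt(b"quarterly forecast attached")

    # ... and protect the session key itself with the recipient's public key.
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_key = public_key.encrypt(session_key, oaep)

    # Recipient: recover the session key with the private key, then the message.
    recovered_key = private_key.decrypt(wrapped_key, oaep)
    message = Fernet(recovered_key).decrypt(ciphertext)
    assert message == b"quarterly forecast attached"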

Combining encryption methods has several advantages. A secure channel can be established between two users’ sets of equipment, and the users can then communicate through hybrid encryption. Asymmetric encryption on its own can slow the process down, but when it is used only to protect the session key while symmetric encryption handles the bulk of the data, the result is added security for the transmission along with improved overall system performance.

What is Project Charter (PC)?

A project charter (PC) is a document that states a project exists and provides the project manager with written authority to begin work.

The document helps the project manager communicate their authority and explain to project participants and stakeholders why the project is needed, who it involves, how long it will take to complete, how much it will cost, what resources are needed and how its successful completion will help the organization. Once created, the document is rarely (if ever) amended.

Depending on a company’s culture and management style, a charter may serve the same purpose as a business case. In a large corporation, the charter may be a multi-page document written up by mid-level management after a business case has been approved but before the project scope has been defined. In a small startup company, however, the charter might just be a few paragraphs with bulleted items and the company president’s signature.

“What usually happens – and is perfectly fine – is that the project manager will write an outline or even the charter itself, and ask the sponsor to review it.” – Tamar Freundlich

Related Terms: project management framework, project charter, PMP, PMO, CAPM

What is Advanced Encryption Standard (AES)?

The Advanced Encryption Standard (AES) is a symmetric-key block cipher algorithm and U.S. government standard for secure and classified data encryption and decryption.

In December 2001, the National Institute of Standards and Technology (NIST) approved the AES as Federal Information Processing Standards Publication (FIPS PUB) 197, which specifies application of the Rijndael algorithm to all sensitive classified data.

The Advanced Encryption Standard was originally known as Rijndael.

The AES comprises three block ciphers, each with a fixed 128-bit block size and cryptographic key sizes of 128, 192 and 256 bits. The underlying Rijndael design also supports larger block and key sizes, up to a maximum of 256 bits, but only the 128-bit block variants were standardized as AES. The AES design is based on a substitution-permutation network (SPN) and does not use the Data Encryption Standard (DES) Feistel network.
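
As a brief illustration, assuming the third-party Python cryptography package, the snippet below applies AES with a 256-bit key in GCM mode, one common way the block cipher is used in practice:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # AES always operates on 128-bit blocks; the key may be 128, 192 or 256 bits.
    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)

    nonce = os.urandom(12)                      # must be unique per message
    ciphertext = aesgcm.encrypt(nonce, b"classified payload", None)

    # The same key decrypts: AES is a symmetric algorithm.
    assert aesgcm.decrypt(nonce, ciphertext, None) == b"classified payload"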

In 1997, the NIST initiated a five-year algorithm development process to replace the DES and Triple DES. The NIST algorithm selection process facilitated open collaboration and communication and included a close review of 15 candidates. After an intense evaluation, the Rijndael design, created by two Belgian cryptographers, was the final choice.

The AES replaced the DES with new and updated features:

  • Block encryption implementation
  • 128-bit block encryption with 128-, 192- and 256-bit key lengths
  • Symmetric algorithm requiring only one encryption and decryption key
  • Data security for 20-30 years
  • Worldwide access
  • No royalties
  • Easy overall implementation

What is Encryption Algorithm?

An encryption algorithm is a component of electronic data transport security. Defined mathematical steps are applied when developing algorithms for encryption purposes, and varying block ciphers are used to encrypt electronic data or numbers.

Encryption algorithms help prevent data fraud, such as that perpetrated by hackers who illegally obtain electronic financial information. These algorithms are a part of any company’s risk management protocols and are often found in software applications.

Encryption algorithms assist in the process of transforming plain text into encrypted text, and then back to plain text for the purpose of securing electronic data when it is transported over networks. By coding or encrypting data, hackers or other unauthorized users are generally unable to access such information. Some encryption algorithms are considered faster than others, but as long as algorithm developers, many of whom have math backgrounds, stay on top of advancements in this technology, this type of encryption should continue to flourish as hackers continue to become more sophisticated.

RSA, developed in 1977 by Ron Rivest, Adi Shamir and Len Adleman, was one of the first encryption algorithms, and it has had ample staying power: it is still widely used for digital signatures and public key encryption. Encryption keys can vary in length, and the strength of an algorithm is generally proportional to its key length.
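
As a short example of RSA’s continued use for digital signatures, again assuming the third-party Python cryptography package (the 3072-bit key size below is simply an illustration of the key-length point):

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Longer keys are slower to generate and use, but harder to attack.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)

    document = b"invoice #4821: pay to the order of ..."
    signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA256())

    # Anyone holding the public key can check the signature; verify() raises
    # an InvalidSignature exception if the document or signature was altered.
    private_key.public_key().verify(signature, document,
                                    padding.PKCS1v15(), hashes.SHA256())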

What is Big Data?

Big data is a combination of structured, semistructured and unstructured data collected by organizations that can be mined for information and used in machine learning projects, predictive modeling and other advanced analytics applications.

Systems that process and store big data have become a common component of data management architectures in organizations. Big data is often characterized by the 3Vs: the large volume of data in many environments, the wide variety of data types stored in big data systems and the velocity at which the data is generated, collected and processed. These characteristics were first identified by Doug Laney, then an analyst at Meta Group Inc., in 2001; Gartner further popularized them after it acquired Meta Group in 2005. More recently, several other Vs have been added to different descriptions of big data, including veracity, value and variability.

Although big data doesn’t equate to any specific volume of data, big data deployments often involve terabytes (TB), petabytes (PB) and even exabytes (EB) of data captured over time.

Companies use the big data accumulated in their systems to improve operations, provide better customer service, create personalized marketing campaigns based on specific customer preferences and, ultimately, increase profitability. Businesses that utilize big data hold a potential competitive advantage over those that don’t since they’re able to make faster and more informed business decisions, provided they use the data effectively.

For example, big data can provide companies with valuable insights into their customers that can be used to refine marketing campaigns and techniques in order to increase customer engagement and conversion rates.

“Ultimately, the value and effectiveness of big data depend on the workers tasked with understanding the data and formulating the proper queries to direct big data analytics projects.” – Bridget Botelho

Related Terms: big data analytics, data types, unstructured data, semistructured data, structured data, 5 V’s of big data

What is Deep learning?

Deep learning is a collection of algorithms used in machine learning to model high-level abstractions in data through model architectures composed of multiple nonlinear transformations. It is part of a broad family of machine learning methods based on learning representations of data.

Deep learning is a specific approach used for building and training neural networks, which are networks of decision-making nodes. An algorithm is considered deep if the input data is passed through a series of nonlinearities or nonlinear transformations before it becomes output. In contrast, most other machine learning algorithms are considered “shallow” because the input passes through only a few levels of transformation.

Deep learning removes the manual identification of features in data and, instead, relies on its training process to discover useful patterns in the input examples. This makes training the neural network easier and faster, and it can yield better results that advance the field of artificial intelligence.
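
The “series of nonlinear transformations” idea can be sketched as a toy forward pass in Python with NumPy; the weights below are random and untrained, so this only illustrates the structure, not a working model:

    import numpy as np

    def relu(x):
        # A simple nonlinearity applied after each linear layer.
        return np.maximum(0.0, x)

    rng = np.random.default_rng(0)
    x = rng.normal(size=(1, 16))          # one input example with 16 features

    # Stacking several linear-plus-nonlinear layers is what makes the model
    # "deep"; a "shallow" model would stop after one or two of these steps.
    layer_sizes = [16, 32, 32, 8, 1]
    activations = x
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        weights = rng.normal(scale=0.1, size=(n_in, n_out))
        activations = relu(activations @ weights)

    print(activations.shape)              # (1, 1): the network's final output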

What is CompTIA Security+?

CompTIA Security+ is a certification for basic security practices and functions.

The Computing Technology Industry Association (CompTIA) advertises this security certification as one of the first security-based certifications information technology professionals should earn.

CompTIA does not set prerequisites for the Security+ certification. There are no age or educational requirements, but CompTIA does recommend that a candidate has at least two years of IT administration experience with a focus on security. This certification exam can be taken online or in person at a designated test center. The test takes 90 minutes to complete and requires a passing score of 750 out of 900.

The CompTIA Security+ certification is good for three years and covers subject areas such as:

  • Threats and vulnerabilities
  • Risk management and mitigation
  • Threat detection and management
  • Network security
  • Application and host security
  • Operational security
  • Access control
  • Identity management
  • Cryptography

It is important to note that CompTIA Security+ gets updated every three years to meet any changes to industry needs. Certification renewals help ensure that IT pros have the skills needed for modern cybersecurity jobs.

“The CompTIA Security+ exam focuses on best practices for risk management and mitigation…Anyone looking to start their path in the security field should consider beginning their journey with the CompTIA Security+ certification.” – Alexander S. Gillis

Related Terms: incident response, penetration testing, risk mitigation, IAM, cloud engineer, access control

What is Project Management Office (PMO)?

A project management office (PMO) is a group or department within a business, agency or enterprise that defines and maintains standards for project management within the organization.

The primary goal of a PMO is to achieve benefits from standardizing and following project management processes, policies and methods. For the office to be most effective, it should embody the organization’s culture and strategy.

The popularity of the PMO has increased as more companies with PMOs have seen returns on their investment.

Nearly seven in 10 organizations globally have a PMO, a figure that has remained constant for five consecutive years, according to the 2016 Pulse of the Profession report by the Project Management Institute (PMI).

“Whether it’s software development projects, network infrastructure deployments, or PC and system upgrades, there’s a lot of opportunity across the IT landscape for those with the right IT project management certifications and courses under their belt.” – Steve Zurier

Related Terms: Agile project management, project planning, project management framework, project management professional, Project Management Body of Knowledge

What is Data Encryption Standard (DES)?

The Data Encryption Standard (DES) is a common standard for data encryption and a form of secret key cryptography (SKC), which uses only one key for encryption and decryption. Public key cryptography (PKC), by contrast, uses two keys, i.e., one for encryption and one for decryption.

In 1972, the National Bureau of Standards (NBS) approached the Institute for Computer Sciences and Technology (ICST) to devise an encryption algorithm to secure stored and transmitted data. The algorithm would be publicly available, but its key would be top secret.

The National Security Agency (NSA) assisted with the cryptographic algorithm evaluation processes, and in 1973, submission invitations were posted in the Federal Register. However, the submissions were unacceptable. In 1974, a second invitation was posted, which resulted in a submission from IBM. In 1975, technical specifications were published for comments in the Federal Register, and analysis and review commenced. In 1977, NBS issued the algorithm, i.e., DES, as Federal Information Processing Standards (FIPS) 46.

Shortly thereafter, the U.S. Department of Defense (DoD) implemented DES. Specifications are outlined in FIPS publication 46-3, FIPS 81, ANSI X3.92 and ANSI X3.106. For security reasons, the U.S. government has never authorized exports of this encryption software.

There are at least 72 quadrillion DES key possibilities. NIST recertified DES in 1993, and the algorithm was eventually superseded by the Advanced Encryption Standard (AES).
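
That 72 quadrillion figure follows directly from the key length: DES keys are 64 bits long, but 8 of those bits are parity, leaving 56 effective key bits:

    # 56 effective key bits give 2**56 possible DES keys.
    key_space = 2 ** 56
    print(f"{key_space:,}")    # 72,057,594,037,927,936 -- about 72 quadrillion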

What is Automatic machine learning (AutoML)?

Automatic machine learning (AutoML) is a general discipline that involves automating any part of the process of building a machine learning application. By working on the various stages of the machine learning process, engineers develop solutions to expedite, enhance and automate parts of the machine learning pipeline.

Automatic machine learning is also known as automated machine learning.

Some automatic machine learning techniques and tools are geared toward expediting and automating data preparation – the aggregation of overall data from various sources. Other parts of this process are aimed at feature engineering – feature selection and feature extraction are a big part of how machine learning algorithms work. Automating these can further improve the machine learning design process.

Another part of automatic machine learning is hyperparameter optimization, which is done through various means. Engineers can use metaheuristics techniques like simulated annealing or other processes to make automatic machine learning happen. The bottom line is that automatic machine learning is a broad catch-all term for any technique or effort to automate any part of the machine learning “end to end” process.
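
As a small illustration of the hyperparameter-optimization piece, the plain-Python random search below tries 50 candidate configurations against a stand-in scoring function; a real AutoML tool would train and evaluate an actual model at that step:

    import random

    def validation_score(learning_rate, num_layers):
        # Hypothetical stand-in for training a model and scoring it on
        # held-out data; a real pipeline would fit and evaluate here.
        return -((learning_rate - 0.01) ** 2) * 1e4 - (num_layers - 4) ** 2

    best_config, best_score = None, float("-inf")
    for _ in range(50):                              # 50 random trials
        config = {"learning_rate": 10 ** random.uniform(-4, -1),
                  "num_layers": random.randint(1, 10)}
        score = validation_score(**config)
        if score > best_score:
            best_config, best_score = config, score

    print(best_config, round(best_score, 3))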

Free Tools

perfSONAR is an open-source network measurement toolkit that provides visibility to the nuances of your network to help with debugging. It offers federated coverage of paths and helps establish end-to-end usage expectations.

Network UPS Tools provides support for assorted power devices, like UPSs and PSUs. It offers many control and monitoring features and a uniform control and management interface, and it covers over 140 manufacturers and thousands of power device models.

CherryTree is a hierarchical, wiki-style notetaking application for organizing your notes, bookmarks, source code, and more. Features rich text, syntax highlighting, and the ability to prioritize information.

Kate is a feature-packed editor for viewing and editing text files. It offers a wide variety of plugins, including an embedded terminal that can launch console commands, a powerful search and replace, on-the-fly spellcheck, and a preview that shows how your MD, HTML, and SVG will look. It supports highlighting for 300+ languages, understands how brackets work, and helps you navigate complex code block hierarchies.

Apcupsd is designed for power management and control of most of APC’s UPS models on Unix and Windows machines. During a power failure, it notifies users that a shutdown may occur. If power is not restored, a system shutdown will follow when the battery is exhausted, a timeout (seconds) expires, or runtime expires based on internal APC calculations determined by power consumption rates. Apcupsd works with most of the Smart-UPS models and most simple signaling models such as Back-UPS and BackUPS-Office.

Pping measures the round-trip delay that application packets experience relative to any capture point on the connection’s path, using the naturally occurring reflected signal available when the timestamp option is used in a TCP connection. Delays are collected per TCP connection, with outbound packets providing the signal and inbound packets the reflection, so Pping reports the delay of two different round trips for the monitored packets.

Training Resources

Pluralsight Skills offers thousands of courses, skill assessments, and more on technology topics like software development, cloud, AI/ML, DevOps, and security—and this month, the entire catalog is free.

How to Become a PowerCLI Superhero – 5 Use-Cases to Get Started is a free webinar that will be hosted by two VMware vExperts on April 21. The event is being produced by Hornetsecurity’s Altaro team and will feature live demonstrations of highly useful applications of PowerCLI – a VMware automation tool. All attendees will receive a free copy of a connected 100+ page PowerCLI eBook.

Tutorials

CsPsProtocol offers a collection of simplified tutorials on core technology topics, including networking, programming, telecom, IoT, and more. The helpful content is original and not available elsewhere.

Linux Upskill Challenge is a month-long course for those who want to work in Linux-related jobs. The course focuses on servers and command line, but it assumes essentially no prior knowledge and progresses gently. This valuable content was offered as a paid course in the past but is now free and fully open source.

Published by Lisa Turnbull

A Windows lover since her childhood days, Lisa has always been enthusiastic about emerging technologies, especially Artificial Intelligence (AI), Data Science and Machine Learning. She works as a freelancer on numerous technical projects.