Raspberry Pi has launched a new device called the Raspberry Pi 400, which is essentially a computer inside a keyboard, similar to devices like the Commodore 64 from the 1980s. #RaspberryPi #RaspberryPi400
PortQuiz is a handy outgoing port tester that listens on all TCP ports, allowing you to test any outbound port.
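A minimal sketch of the kind of check PortQuiz enables: try an outbound TCP connection and report whether it succeeded. Pointing the helper at portquiz.net (which listens on every TCP port) tests whether a given outbound port is open from your network; the helper function itself is an illustration, not part of PortQuiz.

```python
# Sketch: test whether an outbound TCP connection to host:port succeeds.
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if an outbound TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (requires network access):
# can_connect("portquiz.net", 8080)
```

If the call returns False for a port that PortQuiz is known to listen on, something between you and the Internet (a firewall or egress policy) is blocking that outbound port.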
Lithnet Password Protection provides enhanced options for ensuring all Active Directory accounts have strong passwords. You can customize what checks you want to perform as users attempt to change their passwords.
IPvZero Video Channel provides many great technical training videos on networking, network automation and cybersecurity.
Fiddler Everywhere is a web debugging proxy for macOS, Windows and Linux. It allows you to capture, inspect and monitor all HTTP(S) traffic between your computer and the Internet, mock requests and diagnose network issues. Fiddler Everywhere can be used with any browser, application or process.
WinMerge is an open-source differencing and merging tool for Windows. It compares folders and files, presenting differences in a visual text format that is easy to understand and handle. WinMerge can help you see what has changed between project versions, and then merge changes between versions. It can be used as an external tool or as a standalone application.
Ctrl + Space will give you a list of autocomplete suggestions instead of having to cycle through them one by one.
Terms to know
What Are Predictive Analytics Tools?
Predictive analytics tools give business owners a way to predict how their customers and potential audiences will react to promotions and other strategies or campaigns the business might be running. They work by combining well-tracked historical data with real-time insights.
Predictive analytics tools provide insight into customer activity in real time, helping business owners revise and plan their strategies for maximum return. These tools predict various behaviours and patterns, such as where more storage is needed, how to keep users engaged or which campaigns yielded better results, basing their predictions on historical analysis of the data collected. Various predictive analytics tools are available on the market.
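The core idea (predict a future outcome from well-tracked historical data) can be sketched with a toy example: fit a trend line to past campaign results, then forecast a planned campaign. All figures below are invented for illustration.

```python
# Minimal predictive-analytics sketch: ordinary least squares on
# historical campaign data, then a forecast for a planned campaign.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Historical data: ad spend (in thousands) vs. conversions per campaign.
spend       = [1.0, 2.0, 3.0, 4.0, 5.0]
conversions = [12,  19,  33,  41,  52]

a, b = fit_line(spend, conversions)
predicted = a + b * 6.0  # forecast conversions for a planned $6k campaign
print(f"predicted conversions: {predicted:.0f}")
```

Real tools layer far more sophisticated models on top, but the shape is the same: historical data in, a fitted model, a forward-looking prediction out.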
What is Predictive Modelling?
Predictive modelling is a process through which a future outcome or behaviour is predicted based on the past and current data at hand.
It is a statistical analysis technique that enables the evaluation and calculation of the probability of certain results related to software, systems or an entire IT environment.
Predictive modelling works by collecting data, creating a statistical model and applying probabilistic techniques to estimate the likely outcome. In IT, predictive modelling is used to evaluate and identify future trends in a specific technology domain; for example, software usage statistics can be analyzed to predict future usage trends. Predictive modelling is also applied to live systems to evaluate and adjust the underlying system as user and business demands change.
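The collect-model-predict loop above can be sketched with the software-usage example: feed past usage figures into a simple statistical model (exponential smoothing here) and project the next period. The figures are invented.

```python
# Predictive-modelling sketch: project next month's software usage from
# historical data using simple exponential smoothing.

def exponential_smoothing(series, alpha=0.5):
    """Return the smoothed level after consuming the whole series."""
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

monthly_active_users = [100, 110, 125, 150, 180]
forecast = exponential_smoothing(monthly_active_users, alpha=0.5)
print(f"forecast for next month: {forecast:.1f}")
```

The choice of model (smoothing, regression, a full probabilistic model) depends on the domain; what they share is turning past observations into a quantified expectation about the future.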
What is Windows Autopilot?
Windows Autopilot is a desktop provisioning tool native to Windows 10 that allows IT professionals to automate image deployment of new desktops with preset configurations.
Windows Autopilot can simplify the out-of-box experience (OOBE) for new desktop users in an organization. With Windows Autopilot, IT professionals can set new desktops to join pre-existing configuration groups and apply profiles to the desktops so new users can access fully functional desktops from their first logon.
To use Windows Autopilot, IT must register the devices in the Microsoft Azure portal and enroll them in Microsoft Azure Active Directory. Once IT enrolls the devices, it can assign a desktop image to each user before users register their devices. With the images in place, the final step is for users to log on and enter their company credentials for identity verification.
What is Canary Test (Canary Deployment)?
A canary is a code release that is tested in a post-production environment by a small subset of end users.
Canaries may also be referred to as canary tests or canary deployments.
Canary tests, which are usually automated, are run after testing in a sandbox environment has been completed.
Like the canary in a coal mine, the end users selected to receive new code in a canary test are unaware that they are being used to provide an early warning.
Canary tests help software developers uncover new, unexpected scenarios that may not have been detected within the development environment. Because a canary is only distributed to a small number of people, its impact is relatively small and changes can be reversed quickly if code changes slow performance or cause an application to crash.
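One common way to select the small subset of users is to hash each user ID into a bucket and send a fixed percentage to the new release, so assignment is deterministic and reversible. The 5% threshold and version names below are illustrative assumptions, not a specific product's mechanism.

```python
# Canary-routing sketch: deterministically send a small, stable fraction
# of users to the new release by hashing their user ID into 100 buckets.
import hashlib

CANARY_PERCENT = 5  # share of users who receive the new release

def release_for(user_id: str) -> str:
    """Assign a user to 'canary' or 'stable' based on a hash bucket."""
    digest = hashlib.sha256(user_id.encode()).digest()
    bucket = digest[0] % 100          # uniform bucket in 0..99
    return "canary" if bucket < CANARY_PERCENT else "stable"

users = [f"user-{i}" for i in range(1000)]
canary_users = [u for u in users if release_for(u) == "canary"]
print(f"{len(canary_users)} of {len(users)} users get the canary build")
```

Because the assignment is deterministic, the same users stay on the canary for its whole run, and rolling back is just lowering `CANARY_PERCENT` to zero.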
What is Predictive Model Markup Language (PMML)?
Predictive Model Markup Language (PMML) is an XML-based markup language designed to provide a method of defining application models related to predictive analytics and data mining. PMML attempts to eliminate proprietary issues and incompatibility from application exchange models.
PMML addresses the growing demand for interoperable statistical and data mining tools and technology within business systems. It includes many built-in functions, including conditional statements and arithmetic capabilities, which enable broad data manipulation. Specific elements are defined for pre-processing tasks in PMML, e.g., normalization, discretization and value mapping.
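For a feel of the format, here is a minimal PMML document describing a simple linear regression model. The field names and coefficients are invented for illustration; the element structure (`DataDictionary`, `MiningSchema`, `RegressionModel`) follows the PMML specification.

```xml
<PMML version="4.4" xmlns="http://www.dmg.org/PMML-4_4">
  <Header description="Toy regression model for illustration"/>
  <DataDictionary numberOfFields="2">
    <DataField name="ad_spend" optype="continuous" dataType="double"/>
    <DataField name="conversions" optype="continuous" dataType="double"/>
  </DataDictionary>
  <RegressionModel modelName="toy" functionName="regression">
    <MiningSchema>
      <MiningField name="ad_spend"/>
      <MiningField name="conversions" usageType="target"/>
    </MiningSchema>
    <RegressionTable intercept="0.8">
      <NumericPredictor name="ad_spend" coefficient="10.2"/>
    </RegressionTable>
  </RegressionModel>
</PMML>
```

Because the model is plain XML rather than a proprietary binary format, a model trained in one tool can be scored in any other PMML-aware system.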
What is Bi-Directional Predictive Frame (B-Frame)?
A bi-directional predictive frame (B-Frame) is part of an MPEG video compression standard. In this method, groups of sequential pictures are aggregated to form a group of pictures (GOP), which are displayed in sequence to provide video. A single bi-directional predictive frame relates to other frames directly preceding or following it.
By recording just the information that differs from a preceding picture or a following picture, the data storage requirements for each individual picture become much lower than in a technique that would store each successive image completely.
A bi-directional predictive frame may also be known as a bi-directional frame.
Using other factors, such as slices per frame, experts can derive more precise compression ratios for these types of MPEG files. Newer techniques, such as temporal trajectory filtering, can also help. MPEG standards are maintained by the Moving Picture Experts Group, a working group established by the International Organization for Standardization and the International Electrotechnical Commission.
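The difference-storage idea above can be shown with a toy example: the second "frame" is stored only as the pixels that changed relative to the first. Real B-frames additionally use motion compensation and can reference both past and future frames, so this is only the core intuition.

```python
# Toy illustration of storing only frame-to-frame differences.

frame1 = [10, 10, 10, 10, 200, 200, 10, 10]   # 8 "pixel" values
frame2 = [10, 10, 10, 10, 10, 200, 200, 10]   # the bright patch moved right

# Store only (index, new_value) for pixels that changed.
delta = [(i, b) for i, (a, b) in enumerate(zip(frame1, frame2)) if a != b]
print(delta)
```

Here only 2 of 8 pixels need to be stored for the second frame, which is why predictive frames dramatically reduce storage requirements for mostly static video.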
What is String Theory?
String theory, also called superstring theory, is a controversial theory in physics that seeks to reconcile the differences between quantum theory and the theory of relativity. The goal of string theory is to explain the nature of all known forces and matter.
Superstring theory's basic premise is that all matter and forces of nature are simply different manifestations of a single essence consisting of vibrating strings, and that differences between particles can be attributed to variations in how those strings vibrate.
Superstring theory came to the forefront in the 1980s, when Michael Green at Queen Mary College and John Schwarz at the California Institute of Technology demonstrated that it had the potential to be the unifying theory that Einstein sought: one that could be used to mathematically describe gravity as well as electromagnetic forces.
If string theory turns out to be the unifier scientists have been looking for, it is expected to help computer engineers create efficient quantum error-correcting code, an important consideration in quantum computing.
What is Precision Agriculture (PA)?
Precision agriculture (PA) is an approach to farm management that uses information technology to ensure that crops and soil receive exactly what they need for optimum health and productivity. The goal of PA is to ensure profitability, sustainability and protection of the environment.
PA, which is also known as satellite agriculture, relies upon the farmer’s ability to access real-time data about the conditions of the crops, soil and ambient air, along with other relevant information such as hyper-local weather predictions, labor costs and equipment availability.
Sensors in fields can measure the moisture content and temperature of the soil and surrounding air. Satellites and robotic drones provide farmers with real-time images of individual plants. Information from those images can be processed and integrated with sensor and other data to yield guidance for immediate and future decisions, such as precisely what fields to water and when or where to plant a particular crop.
Predictive analytics software uses the data to provide farmers with guidance about crop rotation, optimal planting times, harvesting times and soil management.
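The sensor-to-guidance pipeline described above can be sketched as a simple decision rule: water a field when its soil is dry and no significant rain is forecast. The field names, readings and thresholds are invented for the example.

```python
# Precision-agriculture sketch: turn sensor readings plus a weather
# forecast into per-field irrigation guidance.

FIELDS = {
    "north": {"soil_moisture": 0.18, "rain_forecast_mm": 1.0},
    "south": {"soil_moisture": 0.32, "rain_forecast_mm": 12.0},
}

def needs_water(reading, moisture_min=0.25, rain_skip_mm=10.0):
    """Water when soil is dry and no significant rain is forecast."""
    return (reading["soil_moisture"] < moisture_min
            and reading["rain_forecast_mm"] < rain_skip_mm)

to_water = [name for name, r in FIELDS.items() if needs_water(r)]
print("irrigate:", to_water)
```

Real PA platforms feed far richer data (satellite imagery, hyper-local forecasts, equipment availability) into far richer models, but the output is the same kind of field-by-field, day-by-day guidance.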
What is Data Modelling?
Data modelling is the representation of the data structures in the tables of a company's database, and the resulting data model is a powerful expression of the company's business requirements. This data model is the guide used by functional and technical analysts in the design and implementation of a database.
Data models are used for many purposes, from high-level conceptual models to physical data models.
Data modelling explores data-oriented structures and identifies entity types. This is unlike class modelling, where classes are identified.
Three basic styles of data modelling are generally used in practice today.
- Conceptual Data Models: High-level, static business structures and concepts
- Logical Data Models (LDMs): Entity types, data attributes and relationships between entities
- Physical Data Models (PDMs): The internal schema (physical database design)
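A logical data model (the middle style above) can be sketched in code: entity types, their data attributes and a one-to-many relationship. The entities below are hypothetical; a physical model would then map them onto concrete tables, column types and indexes.

```python
# Logical-data-model sketch: two entity types and a relationship,
# expressed with dataclasses.
from dataclasses import dataclass

@dataclass
class Customer:             # entity type
    customer_id: int        # data attribute (key)
    name: str

@dataclass
class Order:                # entity type
    order_id: int
    customer_id: int        # relationship: each Order belongs to a Customer
    total: float

alice = Customer(1, "Alice")
orders = [Order(10, alice.customer_id, 42.5),
          Order(11, alice.customer_id, 9.9)]
alice_total = sum(o.total for o in orders
                  if o.customer_id == alice.customer_id)
print(alice_total)
```

Note the contrast with class modelling mentioned above: here the focus is on data structures and their relationships, not on behaviour.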
What is Data Mining?
Data mining is the process of analyzing data from different perspectives to uncover hidden patterns and turn them into useful information. The data is typically collected and assembled in common stores, such as data warehouses, where data mining algorithms can analyze it efficiently to support business decision-making and other information needs, ultimately cutting costs and increasing revenue.
Data mining is also known as data discovery and knowledge discovery.
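A classic "hidden pattern" is items frequently bought together (market-basket analysis). A minimal sketch, with invented transactions: count item pairs across baskets and keep the pairs that appear in at least half of them.

```python
# Data-mining sketch: find frequently co-occurring item pairs.
from collections import Counter
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
]

pair_counts = Counter(
    pair
    for basket in transactions
    for pair in combinations(sorted(basket), 2)
)

# Keep pairs bought together in at least half of the baskets.
frequent = {pair for pair, n in pair_counts.items()
            if n >= len(transactions) / 2}
print(frequent)
```

Production algorithms (Apriori, FP-Growth) do the same counting far more cleverly over millions of transactions, but the discovered pattern is the same kind of actionable insight, e.g. which products to shelve or promote together.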
What is NetOps (Network Operations)?
NetOps (network operations) is a blended approach to network management that allows an enterprise to respond quickly to new requests and events while minimizing manual intervention.
Historically, enterprise networking teams have provisioned, deployed and managed infrastructure manually. As a result, many networks are overly fragile and complex and require unique fixes when broken. When each component requires a high degree of human attention and input, scalability is inherently limited. Automating the re-use of established provisioning, configuration and deployment policies makes network infrastructure management more consistent and improves both performance and security.
NetOps also integrates network analytics tools to flag potential performance issues, gather and present relevant data and trigger automated remediation processes. Intent-based networking (IBN), a related initiative that relies heavily on automation, is increasingly treated as a component of NetOps.
The evolution of NetOps is making it more critical for network professionals to possess basic automation and programming skills, as well as the soft skills needed to collaborate proactively with other teams and users.
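The flag-and-remediate loop described above can be sketched as a simple policy check: compare incoming metrics against thresholds and emit the automated actions to trigger. The metric names, thresholds and remediation actions are all assumptions made for the example.

```python
# NetOps sketch: map analytics metrics to automated remediation actions.

THRESHOLDS = {"interface_error_rate": 0.01, "cpu_utilization": 0.90}

def remediation_plan(metrics):
    """Return the automated actions a NetOps pipeline would trigger."""
    actions = []
    if metrics["interface_error_rate"] > THRESHOLDS["interface_error_rate"]:
        actions.append("failover-to-standby-link")
    if metrics["cpu_utilization"] > THRESHOLDS["cpu_utilization"]:
        actions.append("rebalance-traffic")
    return actions

sample = {"interface_error_rate": 0.05, "cpu_utilization": 0.42}
print(remediation_plan(sample))
```

The point is that the policy, not a human, decides the response; engineers spend their time writing and reviewing rules like these rather than applying unique fixes by hand.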
Phantom Analyzer is a simple, real-time website scanner that reveals the invisible trackers lurking in the shadows and collecting information about you.