Knowledge of terms to know
What is Data Center Capacity Planning?
Data center capacity planning is a strategy for ensuring an IT organization’s in-house computing resources, power load, footprint and cooling capacity will be able to meet the workload demands of its users and customers.
Data centers are limited in terms of their footprint, ability to consume power and cooling capacity. Capacity planning software tools can help administrators calculate what resources and how much power draw a data center must support, given current and projected future operations.
Such tools range from simple spreadsheets to 3D renderings of the data center with automated asset discovery and documentation. Some capacity management tools can even automate pre-defined cloud bursting options when in-house power, space and cooling upgrades to the physical site are cost- or time-prohibitive.
It is usually the data center manager or IT director’s responsibility to chart a capacity plan and determine which strategy will best accommodate business needs. IT service management frameworks such as ITIL provide the planner with detailed recommendations for capacity management.
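The headroom calculation at the core of a capacity plan can be sketched in a few lines. The figures and the 10% threshold below are purely illustrative, not drawn from any vendor tool:

```python
# Illustrative capacity-plan check: compare projected demand against
# site limits for power, cooling and floor space. All numbers are
# hypothetical examples.

def capacity_headroom(capacity, projected):
    """Return remaining headroom (as a fraction of capacity) per resource."""
    return {k: (capacity[k] - projected[k]) / capacity[k] for k in capacity}

site = {"power_kw": 500, "cooling_kw": 450, "floor_m2": 400}
projected = {"power_kw": 420, "cooling_kw": 430, "floor_m2": 300}

for resource, headroom in capacity_headroom(site, projected).items():
    flag = "OK" if headroom >= 0.10 else "consider upgrade or cloud bursting"
    print(f"{resource}: {headroom:.0%} headroom ({flag})")
```

A real planning tool would track these figures per rack and over time, but the decision rule is the same: when projected demand erodes the headroom, upgrade the site or burst to the cloud.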
What is Web Mining?
Web mining is the process of applying data mining techniques and algorithms to extract information directly from the Web, drawing on Web documents and services, Web content, hyperlinks and server logs. The goal of Web mining is to look for patterns in Web data by collecting and analyzing information in order to gain insight into trends, the industry and users in general.
Web mining is a branch of data mining that concentrates on the World Wide Web as the primary data source, including all of its components, from Web content and server logs to everything in between. The data mined from the Web may be the collection of facts Web pages are meant to contain, and these may consist of text, structured data such as lists and tables, and even images, video and audio.
Categories of Web mining:
- Web content mining: This is the process of mining useful information from the contents of Web pages and Web documents, which are mostly text, images and audio/video files. Techniques used in this discipline have been heavily drawn from natural language processing (NLP) and information retrieval.
- Web structure mining: This is the process of analyzing the nodes and connection structure of a website through the use of graph theory. Two things can be obtained from this: the structure of a website in terms of how it is connected to other sites, and the document structure of the website itself, that is, how its individual pages are linked together.
- Web usage mining: This is the process of extracting patterns and information from server logs to gain insight into user activity, including where users come from, which items they clicked and the types of activities carried out on the site.
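The usage-mining category above is easy to demonstrate: server access logs can be parsed to count which pages users request most. A minimal sketch, assuming logs in the Common Log Format (the sample lines are made up):

```python
import re
from collections import Counter

# Minimal web-usage-mining sketch: pull the request path out of each
# Common Log Format line and count page popularity.
LOG_PATTERN = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

def page_counts(log_lines):
    counter = Counter()
    for line in log_lines:
        match = LOG_PATTERN.search(line)
        if match:
            counter[match.group(1)] += 1
    return counter

logs = [
    '10.0.0.1 - - [01/Jan/2024:10:00:00] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/Jan/2024:10:00:01] "GET /products HTTP/1.1" 200 1024',
    '10.0.0.1 - - [01/Jan/2024:10:00:05] "GET /products HTTP/1.1" 200 1024',
]
print(page_counts(logs).most_common(1))  # the most-requested page
```

Real usage mining goes much further (sessionization, clickstream paths, referrer analysis), but it starts from exactly this kind of log aggregation.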
What is the Analytical Engine?
The analytical engine most often refers to a computing machine designed by Charles Babbage in the 1830s. It is considered an early and crucial step toward modern computer design. This term can also be used to refer to any comprehensive internal system for analytics.
The Babbage Analytical Engine was a mechanical, digital design that implemented some of the most basic functions of today’s computers. It used a series of punch cards for input and relied on a mechanical printing apparatus for output. Its arithmetical unit and primitive control mechanism of physical pegs and rotating drums would have made the machine, in today’s terms, “Turing-complete,” because it satisfied the criteria for a Turing machine, a concept Alan Turing formalized in 1936.
Although the Babbage Analytical Engine was never physically completed, this early computer would have required a great deal of metal construction and a large footprint of dozens of cubic feet. Even so, it was an important precursor to early mainframe computers, which themselves had many electromechanical assemblies and huge physical footprints compared with today’s digital computers, which accomplish far more using electrical impulses. One other interesting fact about the Analytical Engine is its resident memory, which, according to the design, would have been roughly equivalent to 16.5 kB.
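The ~16.5 kB figure can be sanity-checked with simple arithmetic: the design called for a store of roughly 1,000 numbers of 40 decimal digits each, and each decimal digit carries about log2(10) ≈ 3.32 bits of information:

```python
import math

# Back-of-the-envelope check of the memory estimate above.
numbers, digits = 1_000, 40
bits_per_digit = math.log2(10)          # ~3.32 bits of information per decimal digit
total_bytes = numbers * digits * bits_per_digit / 8
print(f"{total_bytes / 1000:.1f} kB")   # ~16.6 kB, in line with the estimate
```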
Eric Zimmerman’s Tools provides several open-source digital forensics tools that can be used in a wide variety of investigations, including cross-validation of tools, providing insight into technical details not exposed by other tools and more. Eric is a SANS Certified Instructor and Former FBI Agent who has written digital forensics tools that investigators use daily all over the world.
The Complete Office 365 and Microsoft 365 Licensing Comparison lays out the dizzying array of Microsoft products in a visual format. This guide is designed to help those who are confused as to exactly which of the overwhelming number of licensing options is the best fit for a particular situation.
iptables Ubuntu Firewall is David Egan’s tutorial on how to configure iptables—a firewall program for controlling server traffic on a Linux system—which is set to allow all traffic by default.
The New Stack provides information on modern architectures, components of the software development lifecycle and operations tools and culture. Articles investigate new technologies and use cases with an eye toward both technical and business perspectives.
Admin Bundle for Active Directory provides three management tools to help with daily, weekly and monthly AD admin tasks. Specific tools include an Inactive User Account Removal Tool, Import User in Bulk and Inactive Computer Removal Utility.
What is Real-Time Web Analytics?
Real-time Web analytics is a technology by which the owner or manager of a website can monitor the site’s users and activity in an instantaneous (or almost instant) fashion. The term encompasses all facets of a website, including visitors, page views, clicks, sales and other metrics, and generally implies immediate updating of dashboards and reports.
By providing data specifically targeted toward the enhancement of marketing and sales, real-time analytics allows website owners to evaluate user behaviour and interactions, including the number of current visitors, demographics, Web content interaction and referring websites and keywords.
The first generation of Web analytics involved parsing through log files to glean data. This approach was limited both in the amount of data available and in its timeliness, since processing the logs generally introduced a lag. Modern technology, such as Google Analytics, allows even the smallest website to have advanced, real-time analytics.
What is Load Balancing?
Load balancing is a technique used to distribute workloads uniformly across servers or other compute resources to optimize network efficiency, reliability and capacity.
Load balancing is performed by an appliance — either physical or virtual — that identifies in real time which server in a pool can best meet a given client request, while ensuring heavy network traffic doesn’t unduly overwhelm a single server.
In addition to maximizing network capacity and performance, load balancing provides failover. If one server fails, a load balancer immediately redirects its workloads to a backup server, thus mitigating the impact on end-users.
Load balancing is usually categorized as supporting either Layer 4 or Layer 7. Layer 4 load balancers distribute traffic based on transport data, such as IP addresses and Transmission Control Protocol (TCP) port numbers. Layer 7 load-balancing devices make routing decisions based on application-level characteristics that include HTTP header information and the actual contents of the message, such as URLs and cookies.
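The Layer 4 vs. Layer 7 distinction can be sketched in code. A Layer 4 balancer sees only transport data (addresses and ports), so a common strategy is to hash them into a server pool; a Layer 7 balancer can inspect URLs and cookies. The pool names and routing rules below are invented for illustration:

```python
import hashlib

# Hypothetical backend pool; names are made up for this sketch.
POOL = ["app1", "app2", "app3"]

def layer4_pick(client_ip: str, client_port: int) -> str:
    """Layer 4: route on transport data only (here, a hash of IP:port)."""
    key = f"{client_ip}:{client_port}".encode()
    return POOL[int(hashlib.sha256(key).hexdigest(), 16) % len(POOL)]

def layer7_pick(path: str, cookies: dict) -> str:
    """Layer 7: route on application data such as the URL path or cookies."""
    if cookies.get("session_server"):       # sticky session pinned via cookie
        return cookies["session_server"]
    if path.startswith("/api/"):
        return "app3"                       # dedicated API backend
    return "app1"

print(layer4_pick("203.0.113.7", 54321))    # some server from the pool
print(layer7_pick("/api/orders", {}))       # app3
```

The trade-off is the usual one: Layer 4 decisions are cheap because the balancer never parses the request, while Layer 7 decisions allow content-aware routing and session stickiness at the cost of inspecting each message.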
What is Real-Time Customer Analytics?
Real-time customer analytics is a type of analytics that concentrates on data captured from customers as they act, rather than on the historical data that traditional analytics uses to predict future trends. Real-time customer analytics also gives more emphasis to customer interaction and usage data than to page views and similar statistics, providing a customer-centric view rather than a demographic one.
Real-time customer analytics software and techniques make use of all available enterprise data to combine dynamic analysis and reporting based on customer data recorded within the last few minutes. This allows a business to make up-to-the-minute decisions that may affect the user base currently using the organization’s systems.
Real-time customer analytics can feed instant, up-to-date information into company dashboards and reflect business decisions and changes throughout the day. It can also be used to track critical system bugs affecting a large portion of an organization’s customers, or even to detect a DDoS attack from the activity of apparent customers: the system can check whether the behaviour resembles that of regular customers or whether the group is acting in a highly coordinated way, indicating an attack.
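The coordination check described above can be sketched as a toy heuristic: if a single action dominates the last window of events far beyond a normal mix of browsing behaviour, flag it. The event data and the 80% threshold are invented for illustration:

```python
from collections import Counter

# Toy coordinated-behaviour check: real systems would also weigh
# client reputation, rates and timing, not just action mix.
def looks_coordinated(events, threshold=0.8):
    """events: list of (client_id, action) pairs from the last few minutes."""
    actions = Counter(action for _, action in events)
    top_share = actions.most_common(1)[0][1] / len(events)
    return top_share >= threshold

normal = [("u1", "view"), ("u2", "cart"), ("u3", "search"), ("u4", "view")]
attack = [(f"bot{i}", "POST /login") for i in range(100)]
print(looks_coordinated(normal))  # False: a healthy mix of actions
print(looks_coordinated(attack))  # True: everyone doing the same thing
```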
What is Customer Analytics?
Customer analytics is an activity within e-commerce in which software examines customers’ online shopping and internet search behaviour, and teams of marketing professionals use the results to increase revenues for online merchants.
Customer analytics uses data collection and subsequent software analysis to zero in on customers’ online order transactions, sorting out specific customer demographics, shopping patterns and internet usage, and applies predictive analysis so marketers can take measures to increase online business profit margins.
Another commonly used term for customer analytics is customer relationship management analytics, or CRM analytics.
What is Customer Relationship Management Analytics (CRM Analytics)?
Customer relationship management analytics (CRM analytics) refers to applications used to evaluate an organization’s customer data to facilitate and streamline business choices. CRM analytics also may be used for online analytical processing (OLAP) through the use of data mining.
CRM analytics comprises a variety of applications that help measure the effectiveness of customer-related processes and ultimately support customer categorization, drawing on techniques such as profitability analysis, event monitoring, what-if scenarios and predictive modelling.
Through CRM analytics, websites interact with customers more efficiently. This, in turn, makes customer data gathering requirements and opportunities more evident.
CRM analytical tools help in many areas, including:
- Evaluating customer service and satisfaction
- Verifying user data
- Improving supply chain management
- Allowing for more aggressive or better pricing policies
A major challenge of CRM analytics may arise from complications related to integrating legacy systems with new systems and analytical software.
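The profitability-analysis and categorization idea mentioned above can be sketched simply: segment customers by revenue minus cost-to-serve. The customer data, segment names and thresholds are all hypothetical:

```python
# Hypothetical CRM-style profitability segmentation; thresholds and
# customer records are illustrative only.
def categorize(customers):
    segments = {}
    for name, revenue, cost in customers:
        margin = revenue - cost
        if margin > 1000:
            segments[name] = "high-value"
        elif margin > 0:
            segments[name] = "profitable"
        else:
            segments[name] = "at-risk"
    return segments

data = [("Acme", 5000, 1200), ("Beta", 300, 250), ("Gamma", 100, 400)]
print(categorize(data))
```

A production CRM tool would compute margin from many data sources (orders, support tickets, returns), which is exactly where the legacy-integration challenge noted above tends to bite.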
What is Business Analytics (BA)?
Business analytics (BA) refers to all the methods and techniques an organization uses to measure performance. Business analytics is made up of statistical methods that can be applied to a specific project, process or product, and it can also be used to evaluate an entire company. Business analytics is performed to identify weaknesses in existing processes and highlight meaningful data that will help an organization prepare for future growth and challenges.
The need for good business analytics has spurred the creation of business analytics software and enterprise platforms that mine an organization’s data to automate some of these measures and pick out meaningful insights.
Although the term has become a bit of a buzzword, business analytics is a vital part of any business. It makes up a large portion of decision support systems, continuous improvement programs and many of the other techniques used to keep a business competitive. Consequently, accurate business analytics, such as efficiency measures and capacity utilization rates, are the first step to properly implementing these techniques.
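One of the measures just mentioned, the capacity utilization rate, is simply actual output as a share of potential output. A worked example with made-up figures:

```python
# Capacity utilization rate: actual output / potential output, as a percentage.
def capacity_utilization(actual_output: float, potential_output: float) -> float:
    return actual_output / potential_output * 100

# Hypothetical plant: 45,000 units produced against a 60,000-unit capacity.
rate = capacity_utilization(45_000, 60_000)
print(f"{rate:.0f}% capacity utilization")  # 75%
```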
Zim is a graphical text editor for maintaining a collection of wiki pages in a folder structure, including attachments. Pages are stored in plain text files with wiki formatting and can contain links to other pages, simple formatting and images. Plugins provide additional functionality, like a task list manager, an equation editor, a tray icon and support for version control.
Useful Registry hacks to optimize your Windows 10 experience shares detailed instructions on how to enable some hidden features in Windows 10 that can provide more information and enhance your experience.
Terminator is a portable terminal emulator with some nice features, including automatic logging, drag & drop, find, horizontal scrolling, unlimited scrollback, multiple tabs and number reinterpretation.
DataKnox Video Channel offers tutorials, tech news and helpful information on NetDevOps, Data Analytics, Automation and Development. Videos are provided by Knox Hutchinson, who shares details on some of the approaches that will and won’t work for network & systems engineers, cloud architects, software & web developers, business intelligence pros, help desk directors and other tech pros.
OBS Studio is an open-source tool for video recording and live streaming. It allows high-performance, real-time video/audio capture and mixing on Windows, Mac and Linux.