News & Information       http://info.owt.com

Linux

04/19/2024   LinuxSecurity.com
This update includes httpd version 2.4.59, fixing various security issues and bugs. See https://downloads.apache.org/httpd/CHANGES_2.4.59 for complete details of the changes in this release.
04/19/2024   LinuxSecurity.com
New upstream releases: 125.0 and 124.0.2.
04/19/2024   LinuxSecurity.com
Update to 123.0.6312.122, fixing: High CVE-2024-3157: Out of bounds write in Compositing; High CVE-2024-3516: Heap buffer overflow in ANGLE; High CVE-2024-3515: Use after free in Dawn. Also includes the update to 123.0.6312.105.
04/19/2024   LinuxSecurity.com
Update llhttp to 9.2.1, fixing CVE-2024-27982. Backport llhttp 9.2.1 support to python-aiohttp 3.9.3.
04/19/2024   LinuxSecurity.com
2024-04-10, Version 18.20.2 'Hydrogen' (LTS), @RafaelGSS This is a security release. Notable Changes CVE-2024-27980 - Command injection via args parameter of child_process.spawn without shell option enabled on Windows
04/19/2024   LinuxSecurity.com
Update to 0.29 - Fixes 'clipbrowse command execution with multi-line clipboard text including "| sh"'
04/18/2024   Linux Today

In this guide, learn how to debloat (or swap) an Ubuntu system to a cleaner version by removing unnecessary packages and installing new ones as DEBs.

The post How to Debloat (or Swap) an Ubuntu System (Ultimate Guide) appeared first on Linux Today.

04/18/2024   Linux Today

A web application firewall (WAF) is a type of application firewall that lets you see and analyze HTTP traffic to and from a web application. It has the objective of preventing attacks that seek to deny service and steal data.

The post 6 Best Free and Open-Source Web Application Firewalls appeared first on Linux Today.

04/18/2024   Linux Today

Steven Shiau today released Clonezilla Live 3.1.2-22, a new version of his Clonezilla-based live system for disk cloning and imaging, which adds a newer kernel and patches the live system against the recent XZ backdoor.

The post Clonezilla Live Is Now Patched Against the XZ Backdoor, Powered by Linux 6.7 appeared first on Linux Today.

04/18/2024   Linux Today

Exciting updates for Pop!_OS and the COSMIC desktop environment, including new features and performance boosts. Here’s more on that!

The post Pop!_OS’s COSMIC Pre-Alpha Shows Impressive Progress appeared first on Linux Today.

04/18/2024   Linux Today

Volla, the maker of the Volla Phone smartphones, has launched a crowdfunding campaign on Kickstarter for their first tablet device, the Volla Tablet, which will also support the Ubuntu Touch mobile OS.

The post Volla Tablet Launches on Kickstarter with Support for Ubuntu Touch appeared first on Linux Today.

04/18/2024   Linux Journal
Rebuilding and Modifying Debian Packages

Introduction

The Debian packaging system is an integral part of managing software on Debian and its derivatives like Ubuntu. It facilitates the installation, upgrade, and removal of software packages. Sometimes, however, the available binary packages may not meet all the specific requirements of a user or organization, be it due to the need for a patched version, additional features, or customization for performance optimizations. This article delves deep into the process of rebuilding and modifying existing Debian packages, offering a guide for those who need to tailor software packages to their precise requirements.

Understanding Debian Packages

Debian packages, with the .deb file extension, are archives that contain compiled binaries, configuration files, and installation scripts. Understanding the internal structure of these packages is critical. A typical .deb package includes:

  • DEBIAN directory: Holds control files that manage how the package is installed, upgraded, or removed.
  • data archive: Contains the actual files of the package.
Common Files in a Debian Package
  • control: Details package dependencies and metadata like version, maintainer, and description.
  • changelog: Records all the changes made to the package.
  • rules: Specifies how the package is to be built from its source.
Debian Packaging Tools
  • dpkg: The base tool that handles package installation and removal.
  • APT (Advanced Package Tool): Works at a higher level to handle the retrieval and installation of packages from remote repositories.
  • dpkg-dev: A collection of tools necessary to build Debian packages.

Why Modify a Debian Package?

Customizing software can optimize operational efficiency, enhance security, and add or modify features to better suit the environment in which the software operates. Typical reasons for modifying packages include:

  • Customization: Adjusting software to fit specific local policies or performance requirements.
  • Security patches: Quickly applying security patches that are not yet part of official releases.
  • Functional updates: Adding features not available in the standard package.

Preparing the Environment

Installing Necessary Tools

Before beginning, ensure your system has the tools required for Debian package development installed:

sudo apt-get install dpkg-dev devscripts build-essential fakeroot
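
With the tooling in place, the typical rebuild cycle can be sketched as follows, using the hello package as a stand-in; fetching sources requires deb-src lines to be enabled in your APT configuration:

```shell
# Fetch and unpack the package source (needs deb-src entries in APT sources).
apt-get source hello
cd hello-*/

# Apply your changes, then record them; --local appends a version suffix so
# the rebuilt package does not collide with the archive version.
dch --local ~custom "Rebuilt with local modifications."

# Build unsigned (-us -uc) binary packages; the .deb files land one
# directory up and can be installed with dpkg -i.
dpkg-buildpackage -us -uc
```

The dch tool comes from the devscripts package installed above; the version suffix (~custom here) is an arbitrary local convention.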

04/18/2024   Linux Today

Linux dodged a bullet. If the XZ exploit had gone undiscovered for only a few more weeks, millions of Linux systems would have been compromised with a backdoor. We were lucky. But can we stay lucky? The Open Source Security Foundation (OpenSSF) and the OpenJS Foundation revealed that a similar hacking attempt had targeted several […]

The post OpenSSF Warns of Open-Source Social Engineering Threats appeared first on Linux Today.

04/18/2024   Linux Today

MPV 0.38.0 video player rolls out with new scripting options, enhanced macOS functionality, and Vulkan support.

The post MPV 0.38.0 Launches with Advanced Video & Audio Upgrades appeared first on Linux Today.

04/18/2024   Linux Today

After recently patching a security bug that Red Hat decided not to patch, the RHEL clone AlmaLinux has now, in the beta releases of AlmaLinux 9.4 and 8.10, restored support for many devices that RHEL no longer supports.

The post AlmaLinux’s New Beta Releases Return Support for Devices Dropped by RHEL appeared first on Linux Today.

04/17/2024   Linux Today

OpenProject is open-source task management software written in Ruby on Rails and Angular. This tutorial will show you how to install OpenProject on Ubuntu 22.04.

The post How to Install OpenProject on Ubuntu 22.04 appeared first on Linux Today.

04/17/2024   Linux Today

Odoo, previously recognized as OpenERP, is a collection of open-source business applications. It is a highly utilized open-source enterprise resource planning (ERP) software. Odoo provides a variety of modules that can be installed within a single application, contributing to its current popularity. In this guide, we will demonstrate the process of how you can install […]

The post How to Install Odoo 17 on Ubuntu 24.04 appeared first on Linux Today.

04/16/2024   Linux Journal
Understanding Backup and Disaster Planning Solutions for Linux

In today's digital age, the reliability and integrity of computer systems are more critical than ever. For Linux systems, which are widely used in servers and critical infrastructure, ensuring rapid recovery from failures is paramount. This article dives deep into the essentials of Linux system recovery, offering insights into effective backup strategies and disaster planning. By understanding these elements, users and administrators can safeguard their systems against potential disasters, ensuring continuity and security.

Understanding the Basics of Linux System Recovery

System recovery involves restoring a computer system to an operational state following a failure. In the context of Linux, this means having the ability to bring back data, configurations, and the operating system itself after incidents like hardware malfunctions, software corruption, human errors, or natural disasters.

Types of Failures Affecting Linux Systems

Linux systems, robust as they are, can still fall prey to various types of failures:

  • Hardware Failures: These include issues like hard drive crashes, memory corruption, or power supply failures.
  • Software Issues: Software failures may involve bugs, accidental deletion of critical files, or system misconfigurations.
  • Human Error: Often overlooked, human error such as incorrect commands or improper handling of data can lead to significant disruptions.
  • Natural Disasters: Events like floods, earthquakes, or fires can cause physical damage to systems, necessitating robust disaster recovery plans.

Backup Strategies for Linux Systems

A sound backup strategy is the cornerstone of effective system recovery. Here’s how you can approach backing up your Linux systems:

Incremental vs. Full Backups
  • Incremental Backups save changes made since the last backup, conserving storage space and reducing backup time. However, recovery can be slower as it may require a series of incremental backups to restore the latest state.
  • Full Backups involve copying all data to the backup storage. They require more storage space and take longer to complete, but they make recovery fast and straightforward.

Choosing between these methods depends on your specific needs regarding recovery time objectives (RTO) and recovery point objectives (RPO).
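
As a concrete sketch of the incremental approach, GNU tar's --listed-incremental option records backup state in a snapshot file, so a second run archives only what changed since the previous one. All paths below are illustrative:

```shell
# Set up a small demo tree (illustrative paths only).
rm -rf /tmp/bk_demo
mkdir -p /tmp/bk_demo/data /tmp/bk_demo/out
echo v1 > /tmp/bk_demo/data/a.txt
sleep 1   # keep timestamps unambiguous for the demo

# Full (level-0) backup: archives everything and records state in state.snar.
tar -C /tmp/bk_demo -cf /tmp/bk_demo/out/full.tar \
    --listed-incremental=/tmp/bk_demo/out/state.snar data

# A change happens; the next run archives only what is new or modified.
echo v2 > /tmp/bk_demo/data/b.txt
tar -C /tmp/bk_demo -cf /tmp/bk_demo/out/incr.tar \
    --listed-incremental=/tmp/bk_demo/out/state.snar data

# The incremental archive holds the changed directory entry and b.txt,
# not the unchanged a.txt.
tar -tf /tmp/bk_demo/out/incr.tar
```

Restoring the latest state means extracting the full archive first and then each incremental in order, which is exactly the slower recovery path described above.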

04/11/2024   Linux Journal
How to Build Resilience with Linux High Availability Clustering

Introduction

In the age of digital transformation, the uptime and continuous availability of systems are paramount for businesses across all sectors. High Availability (HA) clustering has emerged as a critical strategy for ensuring that services remain accessible, even in the face of hardware or software failures. Linux, with its robustness and flexibility, serves as an ideal platform for deploying HA solutions. This article delves into the concept of Linux High Availability Clustering, exploring its mechanisms, technologies, and the vital role it plays in building resilient and fault-tolerant systems.

Concept of Clustering

At its core, a cluster is a group of interconnected computers that work together as a single system to provide higher levels of availability, reliability, and scalability. Unlike standalone servers, clusters are designed to manage failures seamlessly and ensure that services are not disrupted. Clustering can be categorized primarily into two types: Active-Active and Active-Passive.

  • Active-Active clusters involve multiple nodes all handling requests simultaneously. This not only provides redundancy but also enhances the performance of the system by distributing the load.
  • Active-Passive clusters, on the other hand, consist of active nodes and standby nodes where the standby nodes only come into play if the active ones fail.

The components of a Linux HA cluster typically include hardware nodes, networking, storage, clustering software, and applications configured to run on the cluster.

Key Technologies and Tools in Linux HA Clustering

Linux HA clustering leverages several tools and technologies to ensure system availability:

  • Pacemaker: An open-source cluster resource manager that handles the allocation of resources (such as virtual IPs, web servers, and databases) according to predefined policies in the event of node or resource failures.
  • Corosync: Provides the messaging layer for Linux clustering solutions, ensuring all nodes in the cluster maintain constant communication and are aware of each other's status.
  • DRBD (Distributed Replicated Block Device): Facilitates the replication of data across storage devices in real-time, ensuring data redundancy.
  • Linux Virtual Server (LVS): Manages load balancing and delivers scalability across clustered server nodes.
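
As a hypothetical sketch of how these pieces fit together, the pcs front end to Pacemaker and Corosync can assemble a two-node Active-Passive cluster serving a floating IP and a web server. Node names and the address below are placeholders:

```shell
# Authenticate the nodes and form the cluster (placeholder node names).
sudo pcs host auth node1 node2
sudo pcs cluster setup web_cluster node1 node2
sudo pcs cluster start --all

# A floating virtual IP that Pacemaker moves to a surviving node on failure...
sudo pcs resource create vip ocf:heartbeat:IPaddr2 ip=192.0.2.10 cidr_netmask=24
# ...and a web server constrained to run wherever the IP lives.
sudo pcs resource create web ocf:heartbeat:apache
sudo pcs constraint colocation add web with vip INFINITY
```

In this Active-Passive layout the standby node runs nothing until Pacemaker detects a failure and relocates both resources together.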

Architecture of Linux HA Clusters

The architecture of an HA cluster in Linux environments can vary based on requirements but generally involves several key components:

04/09/2024   Linux Journal
Harnessing the Power of Open Source for Private Clouds: Ubuntu Cloud Infrastructure with OpenStack

In the ever-evolving landscape of technology, cloud computing has emerged as a cornerstone, enabling businesses and individuals alike to leverage vast computing resources without the need for extensive physical infrastructure. Among the various flavors of cloud computing, private clouds offer a tailored, secure, and controlled environment, often making them the choice for organizations with stringent data control, privacy, and compliance requirements. This article delves into how Ubuntu Cloud Infrastructure, in conjunction with OpenStack, provides a robust foundation for setting up private cloud environments, blending flexibility, scalability, and security.

Introduction to Cloud Computing

Cloud computing has revolutionized the way we think about IT resources. It refers to the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the user. The main categories of cloud computing include public clouds, private clouds, and hybrid clouds, each serving different needs and purposes. Private clouds, the focus of our discussion, are cloud environments exclusively used by one business or organization, offering greater control and privacy.

Understanding Ubuntu Cloud Infrastructure

Ubuntu Cloud Infrastructure represents Canonical's commitment to providing a seamless, flexible, and scalable cloud computing experience. It is an integrated cloud infrastructure package that enables businesses to build cloud services within their firewall, with a special emphasis on ease of deployment, management, and maintenance. Ubuntu, known for its stability and security, brings these attributes to the cloud, making it an ideal choice for enterprises looking to deploy their private clouds.

Introduction to OpenStack

OpenStack is an open-source platform for cloud computing, mostly deployed as infrastructure-as-a-service (IaaS), allowing users to control large pools of compute, storage, and networking resources throughout a data center. It's managed by the Open Infrastructure Foundation (formerly the OpenStack Foundation), a non-profit entity established to promote OpenStack and its community. OpenStack's modular architecture ensures flexibility and enables integration with a broad range of software and hardware.

Ubuntu Cloud Infrastructure with OpenStack for Private Clouds

The combination of Ubuntu and OpenStack for deploying private clouds is a match made in heaven for several reasons. Ubuntu serves as the most popular operating system on OpenStack deployments, thanks to its reliability and the comprehensive support provided by Canonical. Together, they offer a powerful platform for building private clouds that can efficiently handle the demands of modern enterprise workloads.

04/04/2024   Linux Journal
Text Manipulation in Linux: Awk Vs. Sed

The Linux operating system is a powerhouse for developers, system administrators, and enthusiasts alike, offering unparalleled flexibility and control. Central to its prowess is the command line, a potent interface through which users can perform intricate operations with just a few keystrokes. Among the myriad of command-line tools available, awk and sed stand out for their text processing capabilities. These tools, though distinct in their functionalities, can be incredibly powerful when used independently or in conjunction. This article delves deep into both, unraveling their complexities, comparing their functionalities, and guiding users on when and how to use them effectively.

Understanding Awk: The Text Processing Powerhouse

awk is more than just a command-line tool; it's a full-fledged programming language designed for pattern scanning and processing. It shines in tasks that involve scanning files, extracting parts of the data, and performing actions on that data. The beauty of awk lies in its simplicity for basic tasks, yet it scales to accommodate complex programming logic for more advanced needs.

The Structure of an Awk Command

An awk command typically follows this structure: awk 'pattern { action }' input-file. The pattern specifies when the action should be performed. If the pattern matches, the corresponding action is executed. This structure allows awk to sift through lines of text, searching for those that meet the criteria specified in the pattern, and then execute operations on those lines.

Key Features of Awk
  • Built-in Variables: awk offers variables like NR (number of records), NF (number of fields in the current record), and FS (field separator), which are instrumental in text processing tasks.
  • Patterns and Actions: Users can specify patterns to match and actions to execute when a match is found, making awk highly versatile.
  • Associative Arrays: Unlike traditional arrays, associative arrays allow indexing using strings, facilitating complex data manipulation.
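
A small example ties these features together: the associative-array idiom below counts how often each value of the second field appears. The sample file is invented for illustration:

```shell
# Sample input: a name and a status on each line.
printf 'alice ok\nbob fail\ncarol ok\n' > /tmp/awk_demo.txt

# NF (fields in the current record) skips blank lines; count[] is an
# associative array keyed by the second field; the END block runs after
# all input is consumed. Iteration order of "for (s in count)" is unspecified.
awk 'NF { count[$2]++ } END { for (s in count) print s, count[s] }' /tmp/awk_demo.txt
```

Swapping the key expression (for example $1, or a substring) adapts the same pattern to log summarization, deduplication, and similar tasks.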

Demystifying Sed: The Stream Editor

While awk is celebrated for its processing capabilities, sed specializes in transforming text. sed is a stream editor, meaning it performs basic text transformations on an input stream (a file or input from a pipeline). It is renowned for its efficiency in editing files without opening them.
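
For instance, two common sed transformations, substitution and line deletion, can be applied in one pass over a stream without ever opening a file:

```shell
# Replace every "colour" with "color" and delete blank lines in one pass.
printf 'colour one\n\ncolour two\n' | sed -e 's/colour/color/g' -e '/^$/d'
# prints:
# color one
# color two
```

The same expressions work on files (sed -e ... file), and GNU sed's -i option applies them in place.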

04/02/2024   Linux Journal
Best Practices and Strategic Insights to Dockerizing Your Linux Applications

In the realm of software development and deployment, Docker has emerged as a revolutionary force, offering a streamlined approach to creating, deploying, and running applications by using containers. Containers allow developers to package up an application with all the parts it needs, such as libraries and other dependencies, and ship it all out as one package. This guide delves deep into the world of Dockerizing applications on Linux, covering best practices, deployment strategies, and much more to empower developers and DevOps professionals alike.

Understanding Docker and Containerization

Docker is a platform that utilizes OS-level virtualization to deliver software in packages called containers. Containers are isolated from one another and bundle their own software, libraries, and configuration files; they can communicate with each other through well-defined channels. Unlike traditional virtual machines, containers do not bundle a full operating system — just the application and its dependencies. This makes them incredibly lightweight and efficient.

The Benefits of Docker
  • Consistency across Environments: Docker containers ensure that applications work seamlessly in any environment, from a developer's personal laptop to the production server.
  • Isolation: Applications in Docker containers run in isolated environments, reducing conflicts between applications and between applications and the host system.
  • Resource Efficiency: Containers share the host system kernel and start much faster than VMs. They also consume less CPU and memory.
  • Scalability and Modularity: Docker makes it easy to break down applications into microservices, making them easier to scale and update.

Setting Up Docker on Linux

The process to install Docker varies depending on the Linux distribution. For Ubuntu, for instance, Docker can be installed with just a few commands:

sudo apt update
sudo apt install docker.io
sudo systemctl start docker
sudo systemctl enable docker

After installation, verify that Docker is running by executing sudo docker run hello-world. This command pulls a test image from Docker Hub and runs it in a container, which prints a message.

Dockerizing Applications: Best Practices

Creating Efficient Dockerfiles

A Dockerfile is a script containing a series of commands and instructions to build a Docker image. The key to an efficient Dockerfile is minimizing the build time and the size of the image.
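
As an illustrative sketch, the multi-stage Dockerfile below keeps the final image small and caches the dependency layer; app.py and requirements.txt are placeholder names for your own application files:

```dockerfile
# Build stage: install dependencies into a separate prefix.
FROM python:3.12-slim AS build
WORKDIR /app
# Copying the dependency list first lets Docker cache this layer until
# the dependencies actually change.
COPY requirements.txt .
RUN pip install --no-cache-dir --prefix=/install -r requirements.txt

# Runtime stage: carry over only the installed packages and the app,
# leaving build tooling behind.
FROM python:3.12-slim
WORKDIR /app
COPY --from=build /install /usr/local
COPY app.py .
CMD ["python", "app.py"]
```

Ordering instructions from least- to most-frequently changed is the main lever for both build time and layer reuse.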

03/28/2024   Linux Journal
Mastering Linux Disk Management: LVM and Disk Partitioning

Linux stands as a bastion of power and flexibility in the world of operating systems, particularly when it comes to managing disk storage. Whether you're a seasoned sysadmin, a developer, or a Linux enthusiast, understanding how to efficiently manage disk space is crucial. This guide delves into the intricacies of Disk Partitioning and Logical Volume Management (LVM), equipping you with the knowledge to optimize your Linux system's storage.

Understanding Disk Partitioning

Disk Partitioning is the first step towards organizing the storage on a disk. It involves dividing a disk into separate sections, each functioning as an independent disk, which can be managed separately. This segregation helps in managing files, running different operating systems on the same disk, or creating a dedicated space for specific data.

Types of Disk Partitions

There are three main types of partitions:

  • Primary Partitions: Directly accessible and used for booting the system. A disk can have up to four primary partitions.
  • Extended Partitions: Created within a primary partition, acting as a container that can hold multiple logical partitions. This is a workaround for the four-partition limit.
  • Logical Partitions: Nested within an extended partition, allowing for more than four partitions on a disk.
File Systems and Their Importance

A file system dictates how data is stored and retrieved. Each partition can use a different file system (ext4, NTFS, FAT32, etc.), affecting performance, storage efficiency, and compatibility.

Tools for Disk Partitioning in Linux

Linux offers a plethora of tools for disk partitioning, including:

  • fdisk: A command-line utility ideal for MBR disks.
  • gdisk: Similar to fdisk but for GPT disks.
  • parted: A versatile tool that supports both MBR and GPT disks.

The Basics of Logical Volume Management (LVM)

LVM is a more flexible approach to managing disk space. It allows for resizing partitions (logical volumes) on the fly, creating snapshots, and combining multiple physical disks into one large virtual one.
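
A hypothetical LVM workflow might look like this; the device names are placeholders, so confirm them with lsblk before running anything against real disks:

```shell
# Mark two disks as physical volumes and pool them into one volume group.
sudo pvcreate /dev/sdb /dev/sdc
sudo vgcreate data_vg /dev/sdb /dev/sdc

# Carve out a logical volume and put a filesystem on it.
sudo lvcreate -n data_lv -L 100G data_vg
sudo mkfs.ext4 /dev/data_vg/data_lv

# Later, grow the volume and its filesystem in one step (-r resizes the fs).
sudo lvextend -L +50G -r /dev/data_vg/data_lv
```

Because the volume group spans both disks, the logical volume can outgrow any single physical device, which is precisely the flexibility plain partitions lack.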

03/26/2024   Linux Journal
Crafting Minimal Ubuntu Images for Embedded Brilliance

Introduction

In the vast and evolving landscape of technology, embedded systems stand as silent yet powerful pillars supporting an array of applications, from the simplicity of a digital watch to the complexity of autonomous vehicles. These dedicated computing behemoths often operate within constrained environments, necessitating an operating system that is not just robust but also refined in its minimalism. Enter Ubuntu, a versatile and widely acclaimed Linux distribution, which emerges as an unexpected yet fitting candidate for this purpose. This article delves into the art of constructing minimal Ubuntu images tailored for the unique demands of embedded systems, illuminating the pathway towards enhanced performance, fortified security, and streamlined maintenance.

Understanding the Core of Minimalism in Embedded Systems

Embedded systems are intricately designed to perform specific tasks, where every millisecond of processing time and every byte of memory counts. In such a landscape, Ubuntu, known for its user-friendly approach and comprehensive support, may not seem like the obvious choice. However, its adaptability and the vast repository of packages make Ubuntu a prime candidate for customization into a lean operating system footprint suitable for embedded applications. The quest for minimalism isn't merely about shedding weight; it's about achieving the pinnacle of efficiency and security.

The Pillars of Performance Enhancement

A minimal Ubuntu image, stripped of unnecessary packages and services, boots faster and runs more efficiently, allowing embedded systems to dedicate more resources to their primary functions. This streamlined approach ensures that embedded devices can operate within their limited computational and memory capacities without compromising on their core functionalities.

The Fortress of Security

In the realm of embedded systems, where devices often operate in critical and sometimes inaccessible environments, security is paramount. A minimal Ubuntu image inherently possesses fewer vulnerabilities, as each removed package eliminates potential entry points for attackers. This minimalistic approach not only secures the device but also simplifies compliance with stringent security standards.

The Ease of Updates and Maintenance

Maintaining embedded systems, particularly those deployed in remote or challenging locations, can be daunting. Minimal Ubuntu images, with their reduced complexity, offer a more manageable solution. Updates are quicker and less intrusive, minimizing system downtime and reducing the risk of update-induced failures.

03/21/2024   Linux Journal
Linux Version Odyssey: Navigating Through Time and Technology

Linux, the cornerstone of modern computing, powers everything from tiny embedded devices to the world's most formidable supercomputers. Its open-source nature has fostered a rich ecosystem of distributions (distros), each tailored to different needs and preferences. However, this diversity also introduces complexity, especially when it comes to managing different versions of Linux over time. This article will navigate you through the labyrinth of past, present, and future Linux versions, equipping you with the knowledge to manage and utilize these systems effectively.

Understanding Linux Versioning

Linux versioning might seem daunting at first glance, but it follows a logical structure once understood. Major Linux distributions like Ubuntu, Fedora, and CentOS have their own versioning schemes, typically involving a mix of numbers and, sometimes, names. For example, Ubuntu versions are numbered based on the year and month of release (e.g., Ubuntu 20.04 was released in April 2020), and LTS (Long Term Support) versions are released every two years, offering five years of support.

Navigating Past Linux Versions

Older versions of Linux distros often face compatibility issues with newer hardware, limiting their functionality. Additionally, as software evolves, applications may no longer support outdated versions, complicating tasks that require up-to-date software. Moreover, security is a significant concern; older, unsupported versions do not receive security updates, exposing systems to vulnerabilities.

Maintaining legacy systems securely requires a strategic approach. One can isolate these systems from the internet or use them in a controlled environment. Furthermore, communities and special-interest groups often support older versions, providing patches or advice on managing these systems.

Embracing Current Linux Versions

Regular updates are crucial for security and performance. Most Linux distros offer simple commands or graphical interfaces to check and apply updates, ensuring your system is protected and efficient. Transitioning between versions, although daunting, is made manageable through guides provided by most distributions, detailing steps to upgrade without losing data.

Transitioning requires careful planning. Always back up your data before upgrading. Understand the changes and new features introduced in the new version to adapt quickly and leverage improvements.

Preparing for Future Linux Versions

Staying informed about upcoming releases allows users to anticipate changes and prepare accordingly. Engaging with Linux communities and news sources can provide insights into future developments. Additionally, participating in beta testing offers a glimpse into new features and the opportunity to contribute to the Linux ecosystem.