Posts

UNIX-File Permission / Access Modes

File ownership is an important component of Unix that provides a secure method for storing files. Every file in Unix has the following attributes −

Owner permissions − The owner's permissions determine what actions the owner of the file can perform on the file.

Group permissions − The group's permissions determine what actions a user who is a member of the file's group can perform on the file.

Other (world) permissions − The permissions for others indicate what actions all other users can perform on the file.

The Permission Indicators

The ls -l command displays various information related to file permissions, as follows −

$ls -l /home/amrood
-rwxr-xr-- 1 amrood users 1024 Nov 2 00:10 myfile
drwxr-xr-- 1 amrood users 1024 Nov 2 00:10 mydir

Here, the first column represents the different access modes, i.e., the permissions associated with a file or a directory. The permissions are broken into groups of three, and each pos...
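The correspondence between the octal mode and the symbolic string in the first column of ls -l can be tried out directly. A minimal sketch using a throwaway file (the name scratchfile is just an example):

```shell
# Create a scratch file and give it mode 754: owner rwx, group r-x, other r--
touch scratchfile
chmod 754 scratchfile

# The first column of ls -l now reads -rwxr-xr--:
# '-' (ordinary file), then rwx / r-x / r-- for owner, group, and others.
ls -l scratchfile

# The same mode expressed symbolically instead of in octal:
chmod u=rwx,g=rx,o=r scratchfile

# Clean up the scratch file.
rm scratchfile
```

Octal 754 is just the three permission triplets read as binary digits: rwx = 7, r-x = 5, r-- = 4.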

UNIX- Directory Management & File System Basics

A directory is a file whose sole job is to store file names and related information. All files, whether ordinary, special, or directory, are contained in directories. Unix uses a hierarchical structure for organizing files and directories, often referred to as a directory tree. The tree has a single root node, the slash character ( / ), and all other directories are contained below it.

Home Directory

The directory in which you find yourself when you first log in is called your home directory. You will do much of your work in your home directory and in the subdirectories you create to organize your files. You can go to your home directory at any time using the following command −

$cd ~
$

Here ~ indicates the home directory. To go to any other user's home directory, use the following command −

$cd ~username
$

To return to your last directory, use the following command −

$cd -
$

Absolu...
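The cd shortcuts above can be combined into one short session; the paths shown are only illustrative:

```shell
cd ~        # jump to your home directory
pwd         # prints your home directory path, e.g. /home/amrood
cd /tmp     # an absolute path works from anywhere
cd -        # return to the previous directory (here, your home directory)
```

Note that cd - also prints the directory it switches to, which is handy for confirming where you landed.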

UNIX-File Management

All data in Unix is organized into files. All files are organized into directories. These directories are organized into a tree-like structure called the filesystem. When you work with Unix, one way or another, you spend most of your time working with files. This tutorial will help you understand how to create and remove files, copy and rename them, create links to them, etc.

In Unix, there are three basic types of files −

Ordinary Files − An ordinary file is a file on the system that contains data, text, or program instructions. In this tutorial, you look at working with ordinary files.

Directories − Directories store both special and ordinary files. For users familiar with Windows or Mac OS, Unix directories are equivalent to folders.

Special Files − Some special files provide access to hardware such as hard drives, CD-ROM drives, modems, and Ethernet adapters. Other special files are similar to aliases or shortcuts and enable you to access a single file usin...
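The three file types can be told apart with ls -l: the first character of the mode column identifies the type. A quick check, assuming the common paths /etc/passwd and /dev/null exist on your system:

```shell
ls -ld /etc/passwd   # '-'  ordinary file
ls -ld /etc          # 'd'  directory
ls -ld /dev/null     # 'c'  character special (device) file
```

The -d flag tells ls to describe a directory itself rather than list its contents.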

UNIX

Unix is a computer operating system capable of handling activities from multiple users at the same time. Unix was originally developed in 1969 by a group of AT&T employees, Ken Thompson, Dennis Ritchie, Douglas McIlroy, and Joe Ossanna, at Bell Labs. This tutorial gives a very good understanding of Unix.

What is Unix?

The Unix operating system is a set of programs that act as a link between the computer and the user. The computer programs that allocate the system resources and coordinate all the details of the computer's internals are called the operating system, or the kernel. Users communicate with the kernel through a program known as the shell. The shell is a command-line interpreter; it translates commands entered by the user into a language that is understood by the kernel. There are various Unix variants available in the m...

The New Face of War: Attacks in Cyberspace

War continues to spread online. Cyberwarfare, the spread of malicious software across networks, just may be the future of war, and cyber attacks continue to grow in number and sophistication each year. In 2006, the Russian Mafia group Russian Business Network (RBN) began using malware for identity theft. By 2007, RBN had come to dominate online identity theft, and by September 2007 their Storm Worm was estimated to be running on roughly one million computers, sending millions of infected emails each day. In 2008, cyber attacks moved from personal computers to government institutions. On August 27, 2008, NASA confirmed a worm had been found on laptops in the International Space Station; three months later, Pentagon computers were hacked, allegedly by Russian hackers. Financial institutions were next. The State Bank of India (India's largest bank) was attacked by hackers located in Pakistan on December 25, 2008. While no data was lost, the attack forced SBI to temporarily shut do...

Cybersecurity Challenges

USB encryption − Almost half the respondents were lacking when it came to USB encryption. They failed to ensure that data from a device connecting to endpoints via USB was sufficiently encrypted, were it to end up in an unsecured or hostile environment.

Third-party device connectivity − Some 35% of organizations aren’t controlling endpoint connectivity channels like SD cards, Bluetooth, and FireWire to limit the threats they potentially bring.

USB control − USB devices can be a significant vector for the distribution of cyber attacks, yet over 35% of respondents don’t control or limit devices connecting to endpoints via USB.

Data loss prevention − Some 37% of companies have no assurance against loss of information, documents, and IP.

Reverse engineering of malware − Only 39% of organizations are actively working on reverse engineering of malware, while 32% are still in an initial phase of developing this capability.

Emergency response team − Only 16% o...

Artificial Neural Networks and Analogy to the Brain

What are Artificial Neural Networks?

Artificial Neural Networks are relatively crude electronic models based on the neural structure of the brain. The brain basically learns from experience. It is natural proof that some problems beyond the scope of current computers are indeed solvable by small, energy-efficient packages. This brain modeling also promises a less technical way to develop machine solutions. This new approach to computing also provides more graceful degradation during system overload than its more traditional counterparts. These biologically inspired methods of computing are thought to be the next major advancement in the computing industry. Even simple animal brains are capable of functions that are currently impossible for computers. Computers do rote things well, like keeping ledgers or performing complex math, but they have trouble recognizing even simple patterns, much less generalizing those patterns of the past into actions for the future. ...