deodatabase

Tuesday, September 09, 2008

Cell biology

Cell biology (also called cellular biology or, formerly, cytology, from the Greek kytos, "container") is an academic discipline that studies cells – their physiological properties, their structure, the organelles they contain, their interactions with their environment, their life cycle, division and death. This is done at both the microscopic and molecular levels. Cell biology research extends both to the great diversity of single-celled organisms such as bacteria and to the many specialized cells in multicellular organisms such as humans.

Knowing the composition of cells and how cells work is fundamental to all of the biological sciences. Appreciating the similarities and also differences between cell types is particularly important to the fields of cell and molecular biology. These fundamental similarities and differences provide a unifying theme, allowing the principles learned from studying one cell type to be extrapolated and generalized to other cell types. Research in cell biology is closely related to genetics, biochemistry, molecular biology and developmental biology.

Monday, August 25, 2008

Enzyme kinetics

Enzyme kinetics is the study of the chemical reactions that are catalysed by enzymes, with a focus on their reaction rates. The study of an enzyme's kinetics reveals the catalytic mechanism of this enzyme, its role in metabolism, how its activity is controlled, and how a drug or a poison might inhibit the enzyme.

Enzymes are usually protein molecules that manipulate other molecules — the enzymes' substrates. These target molecules bind to an enzyme's active site and are transformed into products through a series of steps known as the enzymatic mechanism. These mechanisms can be divided into single-substrate and multiple-substrate mechanisms. Kinetic studies on enzymes that only bind one substrate, such as triosephosphate isomerase, aim to measure the affinity with which the enzyme binds this substrate and the turnover rate.
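Single-substrate kinetics of this kind are classically described by the Michaelis-Menten equation, in which the maximum rate Vmax reflects the turnover rate and the Michaelis constant Km is an inverse measure of substrate affinity. A minimal sketch in Python (the parameter values below are illustrative, not measurements for any real enzyme):

```python
def michaelis_menten(s, vmax, km):
    """Reaction rate v at substrate concentration s.

    vmax: maximum rate at saturating substrate (set by the turnover rate)
    km:   Michaelis constant, the substrate concentration at which the
          rate is half of vmax (lower km = higher apparent affinity)
    """
    return vmax * s / (km + s)

# At s == km the rate is exactly half of vmax:
v = michaelis_menten(5.0, vmax=10.0, km=5.0)  # 5.0

# At very high substrate concentration the rate approaches vmax:
v_sat = michaelis_menten(1e9, vmax=10.0, km=5.0)  # ~10.0
```

Fitting measured rates at several substrate concentrations to this curve is how Km and Vmax are extracted from kinetic data.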

Thursday, August 21, 2008

Network telescope

A network telescope (also known as a darknet, internet motion sensor or black hole) is an internet system that allows one to observe different large-scale events taking place on the Internet. The basic idea is to observe traffic targeting the dark (unused) address-space of the network. Since all traffic to these addresses is suspicious, one can gain information about possible network attacks (random scanning worms, and DDoS backscatter) as well as other misconfigurations by observing it.

The resolution of a network telescope depends on the number of dark addresses it monitors. For example, a large telescope that monitors traffic to 16,777,216 addresses (a /8 telescope in IPv4) has a higher probability of observing a relatively small event than a smaller telescope that monitors 65,536 addresses (a /16 telescope).
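This resolution argument can be made concrete. Assuming a worm probes IPv4 addresses uniformly at random, the probability that a telescope observes at least one probe follows directly from the fraction of the address space it monitors (the probe count below is illustrative):

```python
def detection_probability(monitored, probes, space=2**32):
    """Probability that a telescope monitoring `monitored` dark addresses
    sees at least one of `probes` uniformly random scans of an address
    space of size `space` (IPv4 by default)."""
    miss = 1 - monitored / space        # chance a single probe is missed
    return 1 - miss ** probes           # chance at least one probe is seen

# A /8 telescope (2**24 addresses) vs a /16 telescope (2**16 addresses),
# for a worm sending 1000 random probes:
p8 = detection_probability(2**24, 1000)   # ~0.98
p16 = detection_probability(2**16, 1000)  # ~0.015
```

The /8 telescope is almost certain to see such a worm, while the /16 telescope will usually miss it, which is why larger dark address blocks can resolve much smaller events.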

A variant of a network telescope is a sparse darknet, or greynet, consisting of a region of IP address space that is sparsely populated with 'darknet' addresses interspersed with active (or 'lit') IP addresses.

Monday, August 11, 2008

Custom software development

Custom software development, also known as custom software engineering, refers to building software without relying on predeveloped (template or boxed) solutions. When a customer's needs are unique, custom software development offers tailored approaches to putting their ideas into practice. Its aim is to find creative solutions that meet the customer's specific requirements and preferences as quickly as possible.

The goal of a custom software solution may be to serve not a mass audience of users, but a single customer (user) or group. Custom-developed software can bring together current technologies with the preferences and expectations of the customer. It may be designed in a stage-by-stage process, allowing nuances and possible hidden dangers to be taken into account, including issues that were not mentioned in the original specifications.

Pre-developed software packages, in most cases, cannot be modified or customized to one's needs, and are usually available to all unrelated users or groups of users. For example, software designed for a cell phone manufacturer would be 'custom,' even though there could be thousands of individual users; software written for use by many manufacturers would be packaged, even if there were only a single user in each factory.

Tuesday, August 05, 2008

VersionTracker

VersionTracker.com is a website that tracks software releases. It originally tracked Mac OS software, eventually expanding to cover Mac OS X, Microsoft Windows and Palm OS.

VersionTracker does not host the majority of the software it lists; it merely links to it, hosting files only under special agreements with the developers.

VersionTracker also offers a program called VersionTracker Pro, which checks the software versions on a user's computer and then queries the VersionTracker database to see if any updates are available. This feature is available only to paid subscribers; browsing and searching the database is free.
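The core of such an update check is comparing locally installed version strings against a catalog of latest releases. The sketch below is illustrative only, not VersionTracker's actual implementation; the application names and version numbers are invented:

```python
def parse_version(v):
    """Split a dotted version string into an integer tuple so that
    comparisons are numeric: '1.10.0' -> (1, 10, 0), which sorts
    after '1.9.1' -> (1, 9, 1) as expected."""
    return tuple(int(part) for part in v.split("."))

def updates_available(installed, latest):
    """Return the names of installed applications for which the catalog
    lists a newer version."""
    return [name for name, v in installed.items()
            if name in latest and parse_version(latest[name]) > parse_version(v)]

installed = {"FooEditor": "1.9.1", "BarZip": "2.0.0"}   # hypothetical apps
latest = {"FooEditor": "1.10.0", "BarZip": "2.0.0"}     # hypothetical catalog

print(updates_available(installed, latest))  # ['FooEditor']
```

Tuple comparison is used deliberately: naive string comparison would rank "1.9.1" above "1.10.0".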

Tuesday, July 29, 2008

Neogeography

Neogeography literally means "new geography", and is commonly applied to the use of geographical techniques and tools for personal and community activities, or by non-expert groups of users. Application domains of neogeography are typically not formal or analytical.

The term and field owes much of its inspiration to the Locative media movement that sought to expand the use of location-based technologies to personal expression and society.

Traditional geographic information systems (GIS) have historically developed tools and techniques targeted at formal applications that require precision and accuracy. By contrast, neogeography tends to apply to approachable, colloquial applications. The two realms can overlap when the same problems are presented to different sets of users: experts and non-experts.

Monday, July 21, 2008

Database integrity

Database integrity ensures that data entered into the database is accurate, valid, and consistent. Any applicable integrity constraints and data validation rules must be satisfied before permitting a change to the database.

Three basic types of database integrity constraints are:

* Entity integrity, which ensures that no two rows within a table have the same identity (primary key).
* Domain integrity, which restricts data to predefined data types, e.g., dates.
* Referential integrity, which requires the existence of a related row in another table, e.g., a customer for a given customer ID.
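All three constraint types can be demonstrated with SQLite from Python; the table and column names here are purely illustrative:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled

con.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,                         -- entity integrity
        signup_date TEXT CHECK (signup_date LIKE '____-__-__')   -- domain integrity
    )""")
con.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer(customer_id)     -- referential integrity
    )""")

con.execute("INSERT INTO customer VALUES (1, '2008-07-21')")
con.execute("INSERT INTO orders VALUES (10, 1)")   # accepted: customer 1 exists

try:
    con.execute("INSERT INTO orders VALUES (11, 99)")  # no customer 99
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # the change is refused, keeping the data consistent
```

A duplicate customer_id or a malformed signup_date is rejected the same way: the database refuses any change that would violate a declared constraint.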