Parameterized algorithms is a fast-growing discipline in the area of algorithm design, wherein the primary agenda is to cope with NP-hardness by developing algorithms that are both fast and exact in many situations of practical importance. The area, whose foundations were laid by Downey and Fellows in the nineties, has seen rapid growth in both theoretical and engineering contexts.

This is a community blog, and our intention is to collect expositions and announcements of results related to the area of parameterized algorithms. Please see the community wiki for a comprehensive selection of resources. Also see: questions in cstheory tagged parameterized-complexity.

A brief introduction to the primary goal of a parameterized algorithm is as follows. A *parameterization* of a problem assigns an integer $k$ to each input instance, and we say that a parameterized problem is *fixed-parameter tractable (FPT)* if there is an algorithm that solves the problem in time $f(k)\cdot |I|^{O(1)}$, where $|I|$ is the size of the input and $f$ is an arbitrary computable function depending only on the parameter $k$.
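To make the definition concrete, here is a minimal sketch of perhaps the most standard textbook FPT algorithm: deciding Vertex Cover by a bounded search tree. It runs in time $O(2^k\cdot|I|^{O(1)})$, matching the $f(k)\cdot|I|^{O(1)}$ form above. (This example is illustrative and not taken from any particular post; the function and variable names are our own.)

```python
def vertex_cover(edges, k):
    """Decide whether the graph given by `edges` has a vertex cover of size <= k.

    Bounded search tree: pick any uncovered edge (u, v); every cover must
    contain u or v, so branch on both choices and decrease the budget k.
    The recursion depth is at most k, giving at most 2^k branches, i.e.
    f(k) * poly(|I|) total running time.
    """
    if not edges:
        return True          # no edges left to cover
    if k == 0:
        return False         # edges remain but the budget is exhausted
    u, v = edges[0]
    for w in (u, v):
        # Put w into the cover: delete all edges incident to w.
        remaining = [e for e in edges if w not in e]
        if vertex_cover(remaining, k - 1):
            return True
    return False
```

For instance, a triangle has a vertex cover of size 2 but none of size 1, and the search tree verifies both answers after at most $2^k$ branchings.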

Just as NP-hardness is used as evidence that a problem is probably not polynomial-time solvable, there exists a hierarchy of complexity classes above FPT (the W-hierarchy), and showing that a parameterized problem is hard for one of these classes gives evidence that the problem is unlikely to be fixed-parameter tractable.

A parameterized problem is said to admit a *polynomial kernel* if every instance $(I,k)$ can be reduced in polynomial time to an equivalent instance whose size and parameter value are both bounded by a polynomial in $k$. The study of kernelization is a major research frontier of parameterized complexity, and many of the important recent advances in the area concern kernelization. These include general results showing that certain classes of parameterized problems have polynomial or randomized kernels.
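A classic illustration of a polynomial kernel is Buss's reduction for Vertex Cover, sketched below under our own naming (it is a standard textbook example, not specific to this blog). It shrinks any instance to one with $O(k^2)$ edges, or answers "no" outright:

```python
def buss_kernel(edges, k):
    """Buss's kernelization for Vertex Cover.

    Rule 1: a vertex of degree > k must belong to every cover of size <= k
            (otherwise all of its > k neighbors would be needed), so take it
            into the cover and decrease k.
    Rule 2: once no vertex has degree > k, a yes-instance has at most k*k
            edges, since each of the <= k cover vertices covers <= k edges.

    Returns a reduced (edges, k) pair of size O(k^2), or None for a "no".
    """
    edges = list(edges)
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for u, v in edges:
            degree[u] = degree.get(u, 0) + 1
            degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:
                # Rule 1: v is forced into the cover.
                edges = [e for e in edges if v not in e]
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:
        return None              # Rule 2: no cover of size <= k exists
    return edges, k
```

For example, a star with five leaves and $k=1$ reduces to the empty instance (the center is forced into the cover), while a triangle with $k=1$ is correctly rejected.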

The recent development of a framework for ruling out polynomial kernels under certain complexity-theoretic assumptions has added a new dimension to the field and strengthened its connections to classical complexity.