path: root/kernel/sched_fair.c
Commit message | Author | Date | Files | Lines
* sched: Fix select_idle_sibling() logic in select_task_rq_fair() | Suresh Siddha | 2010-04-23 | 1 | -42/+40
* sched: Pre-compute cpumask_weight(sched_domain_span(sd)) | Peter Zijlstra | 2010-04-23 | 1 | -7/+5
* sched: Add enqueue/dequeue flags | Peter Zijlstra | 2010-04-02 | 1 | -17/+8
* sched: Fix TASK_WAKING vs fork deadlock | Peter Zijlstra | 2010-04-02 | 1 | -2/+6
* Merge branch 'linus' into sched/core | Ingo Molnar | 2010-04-02 | 1 | -1/+1
|\
| * sched, rcu: Fix rcu_dereference() for RCU-lockdep | Paul E. McKenney | 2010-03-01 | 1 | -1/+1
* | sched: Remove AFFINE_WAKEUPS feature | Mike Galbraith | 2010-03-11 | 1 | -2/+1
* | sched: Remove ASYM_GRAN feature | Mike Galbraith | 2010-03-11 | 1 | -17/+11
* | sched: Remove WAKEUP_SYNC feature | Mike Galbraith | 2010-03-11 | 1 | -4/+0
* | sched: Remove FAIR_SLEEPERS feature | Mike Galbraith | 2010-03-11 | 1 | -1/+1
* | sched: Remove NORMALIZED_SLEEPER | Mike Galbraith | 2010-03-11 | 1 | -10/+0
* | sched: Fix select_idle_sibling() | Mike Galbraith | 2010-03-11 | 1 | -4/+10
* | sched: Tweak sched_latency and min_granularity | Mike Galbraith | 2010-03-11 | 1 | -6/+6
* | sched: Cleanup/optimize clock updates | Mike Galbraith | 2010-03-11 | 1 | -2/+0
* | sched: Remove avg_overlap | Mike Galbraith | 2010-03-11 | 1 | -18/+0
* | sched: Remove avg_wakeup | Mike Galbraith | 2010-03-11 | 1 | -31/+0
* | sched: Implement group scheduler statistics in one struct | Lucas De Marchi | 2010-03-11 | 1 | -32/+33
|/
* sched: Fix SCHED_MC regression caused by change in sched cpu_power | Suresh Siddha | 2010-02-26 | 1 | -33/+43
* Merge branch 'sched/urgent' into sched/core | Thomas Gleixner | 2010-02-16 | 1 | -2/+13
|\
* | Merge branch 'sched/urgent' into sched/core | Ingo Molnar | 2010-02-08 | 1 | -1/+1
|\|
| * sched: Fix vmark regression on big machines | Mike Galbraith | 2010-01-21 | 1 | -1/+1
* | sched: Extend enqueue_task to allow head queueing | Thomas Gleixner | 2010-01-22 | 1 | -1/+2
* | sched: Fix the place where group powers are updated | Gautham R Shenoy | 2010-01-21 | 1 | -4/+3
* | sched: Assume *balance is valid | Peter Zijlstra | 2010-01-21 | 1 | -3/+3
* | sched: Remove load_balance_newidle() | Peter Zijlstra | 2010-01-21 | 1 | -122/+13
* | sched: Unify load_balance{,_newidle}() | Peter Zijlstra | 2010-01-21 | 1 | -56/+59
* | sched: Add a lock break for PREEMPT=y | Peter Zijlstra | 2010-01-21 | 1 | -0/+4
* | sched: Remove from fwd decls | Peter Zijlstra | 2010-01-21 | 1 | -67/+60
* | sched: Remove rq_iterator from move_one_task | Peter Zijlstra | 2010-01-21 | 1 | -110/+36
* | sched: Remove rq_iterator usage from load_balance_fair | Peter Zijlstra | 2010-01-21 | 1 | -51/+29
* | sched: Remove the sched_class load_balance methods | Peter Zijlstra | 2010-01-21 | 1 | -29/+37
* | sched: Move load balance code into sched_fair.c | Peter Zijlstra | 2010-01-21 | 1 | -0/+1765
* | sched: Don't expose local functions | H Hartley Sweeten | 2010-01-17 | 1 | -1/+1
|/
* sched: Remove the cfs_rq dependency from set_task_cpu() | Peter Zijlstra | 2009-12-16 | 1 | -6/+44
* sched: Select_task_rq_fair() must honour SD_LOAD_BALANCE | Peter Zijlstra | 2009-12-16 | 1 | -0/+3
* sched: Convert rq->lock to raw_spinlock | Thomas Gleixner | 2009-12-14 | 1 | -2/+2
* sched: Update normalized values on user updates via proc | Christian Ehrhardt | 2009-12-09 | 1 | -1/+10
* sched: Make tunable scaling style configurable | Christian Ehrhardt | 2009-12-09 | 1 | -0/+13
* sched: Fix missing sched tunable recalculation on cpu add/remove | Christian Ehrhardt | 2009-12-09 | 1 | -0/+16
* sched: Remove unnecessary RCU exclusion | Peter Zijlstra | 2009-12-09 | 1 | -7/+2
* sched: Discard some old bits | Peter Zijlstra | 2009-12-09 | 1 | -3/+0
* sched: Clean up check_preempt_wakeup() | Peter Zijlstra | 2009-12-09 | 1 | -40/+33
* sched: Move update_curr() in check_preempt_wakeup() to avoid redundant call | Jupyung Lee | 2009-12-09 | 1 | -2/+2
* sched: Sanitize fork() handling | Peter Zijlstra | 2009-12-09 | 1 | -13/+15
* sched: Protect sched_rr_get_param() access to task->sched_class | Thomas Gleixner | 2009-12-09 | 1 | -5/+1
* Merge branch 'sched/urgent' into sched/core | Ingo Molnar | 2009-11-26 | 1 | -27/+47
|\
| * sched: Strengthen buddies and mitigate buddy induced latencies | Mike Galbraith | 2009-10-23 | 1 | -26/+47
| * sched: Do less agressive buddy clearing | Peter Zijlstra | 2009-10-14 | 1 | -14/+13
* | sched: Optimize branch hint in pick_next_task_fair() | Tim Blechmann | 2009-11-24 | 1 | -1/+1
* | sched: More generic WAKE_AFFINE vs select_idle_sibling() | Peter Zijlstra | 2009-11-13 | 1 | -17/+16