path: root/arch/x86/lib/memset_64.S
Commit message | Author | Age | Files | Lines
* x86/asm: Change all ENTRY+ENDPROC to SYM_FUNC_* | Jiri Slaby | 2019-10-18 | 1 | -2/+2
* x86/asm: Make some functions local | Jiri Slaby | 2019-10-18 | 1 | -4/+4
* x86/asm: Annotate aliases | Jiri Slaby | 2019-10-18 | 1 | -2/+2
* License cleanup: add SPDX GPL-2.0 license identifier to files with no license | Greg Kroah-Hartman | 2017-11-02 | 1 | -0/+1
* x86: move exports to actual definitions | Al Viro | 2016-08-08 | 1 | -0/+3
* Merge branch 'x86/cleanups' into x86/urgent | Ingo Molnar | 2016-03-17 | 1 | -1/+1
|\
| * x86: Fix misspellings in comments | Adam Buchbinder | 2016-02-24 | 1 | -1/+1
* | x86/cpufeature: Carve out X86_FEATURE_* | Borislav Petkov | 2016-01-30 | 1 | -1/+1
|/
* x86/debug: Remove perpetually broken, unmaintainable dwarf annotations | Ingo Molnar | 2015-06-02 | 1 | -5/+0
* x86/lib/memset_64.S: Convert to ALTERNATIVE_2 macro | Borislav Petkov | 2015-02-23 | 1 | -37/+24
* x86/alternatives: Add instruction padding | Borislav Petkov | 2015-02-23 | 1 | -2/+2
* x86_64: kasan: add interceptors for memset/memmove/memcpy functions | Andrey Ryabinin | 2015-02-14 | 1 | -4/+6
* x86-64: Fix memset() to support sizes of 4Gb and above | Jan Beulich | 2012-01-26 | 1 | -18/+15
* x86, mem: memset_64.S: Optimize memset by enhanced REP MOVSB/STOSB | Fenghua Yu | 2011-05-18 | 1 | -12/+42
* x86, alternatives: Use 16-bit numbers for cpufeature index | H. Peter Anvin | 2010-07-07 | 1 | -1/+1
* x86-64: Modify memcpy()/memset() alternatives mechanism | Jan Beulich | 2009-12-30 | 1 | -12/+6
* x86_64: move lib | Thomas Gleixner | 2007-10-11 | 1 | -0/+133