Commit log
|
There is no point in intentionally disabling this feature: the
compiler does not enable it by default, and if it ever were enabled
by default (that is, if the toolchain baseline included it because it
could be expected to be supported everywhere), there would still be
no reason to disable it.
Bug: 22860270
Change-Id: I67019eea63c4fb7183d9e47cf16bc8485022fef2
|
If the environment already indicates that the compiler targets
either x86 or x86_64, there is no need to additionally enforce this
by forcing the compiler to output a specific bitness.
Change-Id: Ife6e717e90b4da4edd852dcd66ad92dba70939a2
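(Illustration, not part of the change above: the commit does not name
the exact flags involved; -m32/-m64 are given here only as the usual
way of forcing bitness on x86 compilers. The minimal sketch below
shows that an x86 or x86_64 toolchain already exposes its target
bitness through the standard GCC/Clang predefined macros __x86_64__
and __i386__, which is why forcing it again is redundant.)

    /* Minimal sketch: once the environment has selected an x86 or
     * x86_64 toolchain, the target bitness is already encoded in the
     * compiler's predefined macros. */
    #include <stdio.h>

    int main(void) {
    #if defined(__x86_64__)
        printf("compiler targets x86_64 (64-bit)\n");
    #elif defined(__i386__)
        printf("compiler targets x86 (32-bit)\n");
    #else
        printf("compiler targets a non-x86 architecture\n");
    #endif
        return 0;
    }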
|
Change-Id: I61b30d0934715cddd54b66ea3b023b2316a0106f
|
The following defines are not referenced anywhere in the code base:
_LIB, MULTICORE, APPLY_CONCEALMENT, THREAD_QUAD_CORE,
DISABLE_NEONINTR, ARMGCC.
The DEFAULT_ARCH define is not used within the encoder at all.
The ANDROID define is not referenced anywhere either; if necessary,
the automatically defined __ANDROID__ can be used instead.
The defines INSERT_LOGO and LOGO_EN were undefined in the makefiles,
but that is unnecessary since nothing actually defines them. The
decoder x86_64 makefile also undefined LOGO_EN; an
architecture-specific makefile should not touch such feature
settings, otherwise there is a risk that builds for different
architectures behave significantly differently.
Change-Id: I13b86c8bf2feb3a381d904a13f18c3b35f40a575
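(Illustration, not code from the change above: __ANDROID__ is
predefined automatically by compilers configured for Android targets,
so a source-level check like the minimal sketch below works without
any hand-maintained -DANDROID coming from the makefiles.)

    /* Minimal sketch: detect an Android build via the compiler's
     * automatically defined __ANDROID__ macro, with no ANDROID
     * define needed from the build system. */
    #include <stdio.h>

    int main(void) {
    #ifdef __ANDROID__
        printf("building for Android\n");
    #else
        printf("not an Android build\n");
    #endif
        return 0;
    }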
|
Change-Id: Ia4f99d5b963acd8d8a1afc2fbdf06b122d898f63
|
Change-Id: I7efe9a589cd24edf86e8d086b40c27cbbf8b4017