Changing icons for pause/resume video by Johan Ejdemark ( johanejdemark AT hotmail DOT com).
Azerbaijani translation by Eldost ( l-dost AT mail DOT ru ).
Brazilian translation by Kaio Duarte.
-Chinese Simplified translation by Michael Lu ( yeskky AT gmail DOT com ) and tumuyan ( tumuyan AT gmail DOT com ).
+Chinese Simplified translation by Michael Lu ( yeskky AT gmail DOT com ), tumuyan ( tumuyan AT gmail DOT com ) and Tommy He.
Chinese Traditional translation by You-Cheng Hsieh ( yochenhsieh AT gmail DOT com ) and Hsiu-Ming Chang.
Belarusian translation by Zmicer Turok.
Czech translation by Jaroslav Svoboda ( multi DOT flexi AT gmail DOT com , http://jaroslavsvoboda.eu ).
French translation by Olivier Seiler ( oseiler AT nebuka DOT net ) and Eric Lassauge ( lassauge AT users DOT sf DOT net ).
-German translation by Ronny Steiner, Sebastian Ahlborn, Carsten Schlote, Wilhelm Stein.
+German translation by Ronny Steiner, Sebastian Ahlborn, Carsten Schlote, Wilhelm Stein, Jochen Wiesel.
Greek translation by Wasilis Mandratzis-Walz.
Hungarian translation by Báthory Péter.
-Italian tranlation by Valerio Bozzolan, Stefano Gualmo ( s DOT gualmo AT gmail DOT com ).
+Italian translation by Valerio Bozzolan, Stefano Gualmo ( s DOT gualmo AT gmail DOT com ), Renato Giliberti.
Japanese translation by Mitsuse and Yanagimoto Yoshiaki.
Korean translation by Halcyonest.
Norwegian Bokmål translation by Imre Kristoffer Eilertsen ( imreeil42 AT gmail DOT com ).
-Polish translation by Jacek Buczyński.
+Polish translation by Jacek Buczyński and Grzegorz Koryga.
-Russian translation by maksnogin ( maksnogin AT gmail DOT com ), Grigorii Chirkov, Dmitry Vahnin aka JSBmanD, Aleksey Khlybov.
+Russian translation by maksnogin ( maksnogin AT gmail DOT com ), Grigorii Chirkov, Dmitry Vahnin aka JSBmanD, Aleksey Khlybov, Ilya Pogrebenko.
Slovenian translation by Peter Klofutar.
Spanish translation by Mario Sanoguera ( sanogueralorenzo AT gmail DOT com , https://play.google.com/store/apps/developer?id=Mario+Sanoguera ; Sebastian05067, https://forum.xda-developers.com/member.php?u=6302705 ) and Gonzalo Prieto Vega.
Turkish translation by Serdar Erkoc ( serdarerkoc2004 AT yahoo DOT com ).
-In general, Google Nexuses have worked well for Open Camera. Firstly because Google provide good support for their own
- Android camera API, both old and Camera2 (as you'd hope!) - and they use it for their own Google camera. Secondly because
- I've owned several Nexuses (Galaxy Nexus, Nexus 7, Nexus 6), and so have been able to test Open Camera against them.
+In general, Google Nexuses and Pixels have worked well for Open Camera.
-Camera2 on the Nexus 6 works well (there are some minor issues, e.g., manual exposure doesn't work well when recording
+Camera2 API on the Nexus 6 works well (there are some minor issues, e.g., manual exposure doesn't work well when recording
video). It's hard to be sure about other Nexuses though.
-In theory all this should apply to the Pixels, but I haven't been able to test - though from what people tell me, things
-seem to work including Camera2 features. Open Camera also supports Google's HDR+ mode on the Pixels with Pixel Visual Core.
+Similarly Camera2 API works well on the Pixel 6 Pro, although there are some minor issues (manual white balance doesn't work, some
+ of the Video picture profiles don't work). Open Camera also supports Google's HDR+ mode on the Pixels with Pixel Visual Core
+ (including the Pixel 6 Pro). As of Open Camera 1.50, Night Sight on the Pixel 6 Pro is available via the photo mode X-Night.
+ As of Open Camera 1.50, all of the Pixel 6 Pro's cameras are available to use by zooming in or out.
Color effects don't work on the Nexus 7.
@@ -168,16 +167,21 @@ manual controls, RAW and 120fps video.
API. Known issues are:
Slow motion and high speed frame rate video doesn't work (see below for more details).
-
The maximum manual shutter speed allowed for third party camera applications is 0.1s (most devices
- seem to allow at least 0.5s).
The "Image quality" setting has no effect for JPEGs (unless post-processing options such as auto-level or
photo stamp are applied). This has also been reported for other Samsung devices; I also have the same
issue with other third party camera applications on my S10e. See
this thread
for details.
+HDR and Expo bracketing do not work properly on Android 11+ (the image that is meant to be a darker
+ exposure does not come out dark). This can be resolved in Open Camera by enabling the option
+ Settings/Photo settings/"Enable dummy capture HDR/expo fix" (and making sure
+ "Enable fast HDR/expo burst" is also enabled).
-Both rear physical cameras are available to Open Camera; also the two modes for the front camera ("cropped" and
- "wide") are available to Open Camera.
+Both rear physical cameras (standard and ultra wide) are available to Open Camera (as separate cameras); also the two modes for the front camera
+ ("cropped" and "wide") are available to Open Camera.
+
+
At least some Samsung Galaxy devices support the camera extension modes (X-Auto, X-Night, X-Bokeh, X-Bty) (including the Galaxy S10e;
+in general this is likely available for the flagship S devices running Android 12+).
More generally I have occasionally tested on various Samsung devices using their remote test labs - although useful, this is limited
compared to owning a real device (especially when the test labs are dark!)
@@ -205,7 +209,7 @@ manual controls, RAW and 120fps video.
Note 9 (these articles
are for Filmic Pro, but the issues faced likely affect all third party camera applications, including Open Camera).
-
On a positive note, the Galaxy Note 4 and 5 were used with Open Camera to film
+
diff --git a/_docs/focus_mode_auto.png b/_docs/focus_mode_auto.png
index 56b64ba353ee10c7756b2f35c367e0839ccd652e..7e48c33a32bccdcd0e3eefd092a610803a017922 100644
Binary files a/_docs/focus_mode_auto.png and b/_docs/focus_mode_auto.png differ
diff --git a/_docs/focus_mode_continuous_picture.png b/_docs/focus_mode_continuous_picture.png
index 1c8f98e5a6250313e83a6e3a74aaedcb44093cea..51d42c162e90551f8a98d8d915d950a3fc22e8d4 100644
Binary files a/_docs/focus_mode_continuous_picture.png and b/_docs/focus_mode_continuous_picture.png differ
diff --git a/_docs/focus_mode_edof.png b/_docs/focus_mode_edof.png
index e54169327373a7777fa9b1565c0e2eddb1271594..4e01fb608f27c7904e65c480ec9c7f7c4715d841 100644
Binary files a/_docs/focus_mode_edof.png and b/_docs/focus_mode_edof.png differ
diff --git a/_docs/focus_mode_fixed.png b/_docs/focus_mode_fixed.png
index 7ae033fe616d83bd6148d17c3afbe63eeedc2f7d..420daef09df19498698759d0561c1834f4e407c9 100644
Binary files a/_docs/focus_mode_fixed.png and b/_docs/focus_mode_fixed.png differ
diff --git a/_docs/focus_mode_infinity.png b/_docs/focus_mode_infinity.png
index 745587bd2659f2ffdc870fb42bb3301504a9240f..649110367e5b3c8d27ad805aa513314d3c657033 100644
Binary files a/_docs/focus_mode_infinity.png and b/_docs/focus_mode_infinity.png differ
diff --git a/_docs/focus_mode_manual.png b/_docs/focus_mode_manual.png
index 09171291ab76e80f3c019bc675aeae0b308be68b..c4341672b3ad641f5c9d11526af2583016a44a64 100644
Binary files a/_docs/focus_mode_manual.png and b/_docs/focus_mode_manual.png differ
diff --git a/_docs/help.html b/_docs/help.html
index 0ebedc42bfd0a85fede2cb6f334e482532fc31a2..534d9d3370173764e9ba748b8d260a0049932cb5 100644
--- a/_docs/help.html
+++ b/_docs/help.html
@@ -31,16 +31,15 @@
-
+
+
+
@@ -124,7 +123,7 @@ a photo. In some cases, you can also hold (long press) for a continuous burst:
smaller video icon will switch to video mode. The photo and video icons will then swap: click the larger video icon to
start/stop video recording, and click the smaller photo icon to switch back to photo mode.
-Switch camera - Switches between front and back camera (if your
+Switch camera - Switches between front and back camera (if your
device has both front and back cameras). If your device has more than one front and/or back camera, then this will switch
between the first front and back camera.
@@ -134,7 +133,9 @@ between the first front and back camera.
the standard and ultra-wide camera.
If Settings/On screen GUI/"Multiple cameras icon" is disabled, then this icon will not show; instead the "Switch camera"
icon can be used to cycle through all the cameras.
- Note that some devices do not allow third party applications to access their multiple cameras, in which case Open Camera isn't
+ Note that some devices do not expose the multiple cameras explicitly, but instead will automatically switch cameras as required
+ when zooming in or out (requires Camera2 API).
+ Note that some other devices do not allow third party applications to access their multiple cameras at all, in which case Open Camera isn't
able to use them.
Exposure lock - Click to lock or unlock the exposure.
@@ -195,7 +196,8 @@ will be available which instead work by making the screen light up (note, front
Manual -
A slider appears allowing you to manually control the focus distance (only available if Camera2
- API is used).
+ API is used). Also see the options "Focus assist" and "Focus peaking" under Settings/Camera preview/
+ which may be useful when using manual focus.
Fixed -
The focus remains fixed.
@@ -226,7 +228,20 @@ will be available which instead work by making the screen light up (note, front
taken, at infinite focus distance. Focus bracketing is typically used with
Focus stacking software to merge the images into a single
photo. Note that whilst taking a set of focus bracketed photos, you can cancel the set by pressing the "take photo"
- button again.
+ button again. Also see the options "Focus assist" and "Focus peaking" under Settings/Camera preview/
+ which may be useful when adjusting the focus distances.
+X- modes - These extension modes enable device specific algorithms or effects that manufacturers have exposed to
+ third party applications (via Android's camera extensions API) (requires Android 12; only available on some devices,
+ and if Camera2 API is used).
+
+X-Auto: Allows the device to choose which algorithm to use based on the current scene. Note this differs from
+ "STD" mode in that it allows the use of the other camera extensions e.g. Night mode for low light scenes.
+X-Night: Improves image quality under low light conditions.
+X-Bokeh: Blurs the background of photos. This is typically intended for taking portraits of people.
+X-Bty: Face retouch or "beauty", applies cosmetic effects to people's faces.
+
+ Note many features are unavailable when using an extension mode, including flash, zoom and manual controls.
+
Auto-level - Enable the auto-level feature for photos (see
below). (Only available if the device has enough memory.)
@@ -265,7 +280,7 @@ has a hardware menu button, pressing that should also open the settings.)
photo/video (by default saved in the OpenCamera folder). If you get the message "No Gallery app available", then you should install
a Gallery app.
You can also "long press" on the Gallery icon - this will let you switch between the recent save locations, or take you straight to a
-file dialog to choose a save location if additional locations have yet been defined. See
+dialog to choose a save location if additional locations have not yet been defined. See
Save location under Settings/More camera controls for more details.
Pause video - When recording
@@ -296,8 +311,8 @@ cached location). If the location isn't available, a dash will be shown through
photos look right whether you hold the device in "portrait" or "landscape" mode. But Open Camera has the option to rotate the
photos so they are perfectly level, so your shots come out looking perfectly level every time!
The above shows a rather exaggerated example - in practice, you can probably take better photos, but this feature ensures they
@@ -338,6 +353,7 @@ algorithm is in reducing noise and enhancing detail.
If you have a Google Pixel with Pixel Visual Core, you should get Google's HDR+ photos when using Open Camera's Standard photo mode, so there
is generally little benefit to using Open Camera's NR mode on these devices. See "Does Open Camera support HDR+" in the
FAQ for more details.
+If the photo mode "X-Night" is available, this may give better results for low light scenes.
In Noise Reduction photo mode, an additional "NR mode" option will appear on the popup mode. This defaults to Normal, but
@@ -345,8 +361,6 @@ you can change to "Low Light" mode, which further improves results in dark scene
mode, it will take a burst of images for a duration of around 5 seconds. For best results, use a tripod, or try to hold the
camera as steady as possible.
Dynamic Range Optimisation (DRO) is a technique that optimises the dynamic range available in the image. In particular, dark
@@ -360,13 +374,13 @@ see DRO vs HDR.
High Dynamic Range Imaging (HDR) is a technique where the camera takes multiple shots at different exposures, and combines them
into a single image. A typical problem in photography is that a scene may contain a brightness range that is wider than what can be
-captured in a single shot. Varying the exposure (whether by touching on the screen, or exposure compentation or manual exposure) might
-make darker regions brighter, but leave other areas over-exposured. Or reducing the exposure to prevent over-exposure may result in
+captured in a single shot. Varying the exposure (whether by touching on the screen, or exposure compensation or manual exposure) might
+make darker regions brighter, but leave other areas over-exposed. Or reducing the exposure to prevent over-exposure may result in
the rest of the scene being too dark. HDR uses an algorithm to combine the best parts of each image, and adjusts the colors so that
the full range of brightness values are captured in the scene:
The left set of three images show the individual exposures, the right the final HDR image.
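As a toy sketch of the merging idea described above (this is not Open Camera's actual algorithm - the weighting function and EV handling are illustrative assumptions): each bracketed sample of a pixel is weighted by how well-exposed it is, mapped back to a common brightness scale, and blended, so well-exposed samples dominate the final value.

```python
def hdr_weight(v):
    # Triangle weighting: trust mid-tones, distrust near-black and
    # near-white samples (which carry little usable information).
    return max(0.0, 1.0 - abs(v - 128.0) / 128.0)

def merge_exposures(pixels, stops):
    # pixels: the same pixel sampled from each bracketed shot (0-255).
    # stops: exposure offset of each shot in EV, e.g. [-2, 0, +2].
    total_w = total = 0.0
    for v, ev in zip(pixels, stops):
        w = hdr_weight(v)
        radiance = v / (2.0 ** ev)   # undo the exposure difference
        total += w * radiance
        total_w += w
    # Fall back to the middle exposure if every sample was unusable.
    return total / total_w if total_w else pixels[len(pixels) // 2]
```

With samples [64, 128, 250] at [-2, 0, +2] EV, the nearly blown-out +2 EV sample gets almost no weight, so the result stays close to the well-exposed middle shot.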
@@ -403,13 +417,6 @@ three separate images, rather than reprocessing a single image.
than Standard photo mode). Though NR still has the advantage that it is less prone to ghosting and misalignment. NR is also better
suited to working in a wide range of scenes.
-The set of images below shows: Standard photo on the left, Noise Reduction in the middle, HDR on the right. There is less overexposure
-in the sky in the Noise Reduction result, but the HDR mode does even better.
In summary: NR is better if you just want a "works best in most cases" option. HDR may be a better choice specifically in
scenes with high dynamic range, that also don't have movement in the scene.
@@ -494,7 +501,7 @@ affects taking photos.
Pause after taking photo - If ticked, after taking a photo the display will pause, with options to share
or delete
- the
+ the
photo. To keep the photo and continue, touch the screen, press back, or take another photo. Note that this isn't supported
when holding the shutter button to take a continuous burst of photos.
@@ -522,17 +529,18 @@ key to focus, then both to take a photo.
Audio control options - If enabled, this allows taking a photo (or starting video recording, depending on the mode)
by making a noise. An on-screen microphone button will appear, to
start/stop listening. The "loud noise" option will listen for any noise (so you can remotely take a photo by saying "cheese",
-whistling, or whatever you prefer). Note that leaving the listening turned on may use additional battery. The "voice command" option
-listens specifically for saying "cheese" - so this has the advantage that it's less likely to be triggered unintentionally.
+whistling, or whatever you prefer). Note that leaving the listening turned on may use additional battery.
Note that this can't be used to stop video recording - if you want to have some remote control on video recording,
see the "Max duration of video" option.
Audio control sensitivity - This controls how sensitive Open Camera is to noises, if "Audio control" is set to "Loud noise".
If you find it's taking photos too often unintentionally, or isn't responding to your sounds, try adjusting this option.
-Bluetooth LE remote control - Open Camera supports connecting to the Kraken Smart Housing via the options in
-these settings. Once connected via Bluetooth, it should be possible to control Open Camera from the device. The on-screen
-display of Open Camera will also display information from the Kraken (temperature and depth).
+Bluetooth LE remote control - Open Camera supports connecting to some specific "smart housing" cases via the
+ options in these settings. See "Remote device type" for supported types. At the time of writing, only one make/model
+ is supported. Once connected via Bluetooth, it should be possible to control Open Camera from the device.
+ The on-screen display of Open Camera will also display information from the housing (temperature and depth).
Lock photo/video orientation - Normally the orientation of the photo/video will be rotated by some multiple of
90 degree such that the orientation looks right - e.g. if your device is held in portrait, the resultant image/video will
@@ -540,19 +548,28 @@ be in portrait. This option allows fixing the camera to either be in portrait or
auto-level is also enabled, it will have the effect of aligning photos to the nearest
90 degrees.
-Save location - Select the folder to store the photos in. Click on a folder
-(or "Parent Folder") to navigate through the filesystem. Select "New Folder" to create a new folder in the currently
-displayed folder. Select "Use Folder" to choose the currently displayed folder. Note that on Android, there are some
-folders that cannot be written to - Open Camera will display a message if you try to use one of these folders. Once
-you have specified a new save location, you can long press on the Gallery icon to quickly switch between recent save
-locations. Note that if "Use Storage Access Framework" is selected, this option will instead show up the Android standard
-file chooser - navigate to the desired folder, and click "SELECT". If you want to save to an SD card, see "How can I
-save to my external SD card?" under the FAQ.
+Save location - Select the folder to store the resultant photos or videos in.
+
+On Android 9 or earlier: This opens a file dialog. Click on a folder (or "Parent Folder") to navigate through
+ the filesystem. Select "New Folder" to create a new folder in the currently displayed folder. Select "Use Folder"
+ to choose the currently displayed folder. Note that on Android, there are some folders that cannot be written
+ to - Open Camera will display a message if you try to use one of these folders.
+On Android 10 or later: This opens a dialog to type the name of the folder. This
+ will be a subfolder of DCIM on your internal storage. You can specify subfolders with the "/"
+ character. For example, specifying Camera/holiday will save inside DCIM/Camera/holiday/
+ on your internal storage.
+If "Storage Access Framework" is enabled: Then on any Android version, this option
+ will show up the Android standard file chooser - navigate to the desired folder, and click "SELECT" or
+ "ALLOW ACCESS" (wording varies depending on Android version).
+
+Once you have specified a new save location, you can long press on the Gallery icon to quickly switch between recent save
+ locations. If you want to save to an SD card, see "How can I save to my external SD card?" under the FAQ.
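As an illustration of the Android 10 or later behaviour described above, the typed folder name (with optional "/" separators) maps to a subfolder of DCIM on internal storage. This is a minimal sketch; resolve_save_location is a hypothetical helper written for this example, not an Open Camera function.

```python
from pathlib import PurePosixPath

def resolve_save_location(typed_name: str) -> str:
    # On Android 10+, the name typed into the Save location dialog becomes
    # a subfolder of DCIM on internal storage; "/" introduces subfolders.
    return str(PurePosixPath("DCIM") / typed_name)

# Typing "Camera/holiday" saves into DCIM/Camera/holiday on internal storage.
```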
-Use Storage Access Framework - If selected, Open Camera will instead use the Android
+Storage Access Framework - If selected, Open Camera will instead use the Android
Storage Access Framework. This
-has some advantages, such as using the standard Android file picker, and being the only way to save to SD cards on Android 5
-- though it may be slightly slower to take photos. (Requires Android 5.0 or higher.)
+has some advantages, such as using the standard Android file picker, and being the only way to save to SD cards on Android 5+.
+In some cases it may allow you to save to cloud or local storage providers provided by other apps or services.
+Furthermore on Android 10+, it is the only way to save outside of the DCIM/ folder. (Requires Android 5.0 or higher.)
Distance unit - If "Stamp photos" is enabled, this controls whether to use metres (m) or
feet (ft) when recording the GPS altitude. Also used for Video settings/"Video subtitles".
@@ -942,6 +965,11 @@ Camera2 API. A common issue is poor flash behaviour (either flash doesn't fire,
If so, enabling this option may help - this uses an alternative algorithm for flash (using the torch to simulate flash
as a workaround). Note that this is enabled by default for Samsung devices.
+Enable dummy capture HDR/expo fix - (Camera2 API only.) Enable this option if your device has problems taking photos
+in HDR or Exposure Bracketing photo modes, specifically if some expo images come out with the same exposures. This option
+takes an additional "dummy" image which may resolve such problems. Note that "Enable fast HDR/expo burst" (below) must
+be enabled for this option to have an effect.
+
Enable fast HDR/expo burst - (Camera2 API only.) Disable this option if your device has problems taking photos
in HDR or Exposure Bracketing photo modes (disabling this option will result in a longer delay between the photos being
taken, but may give more stable behaviour if your device is having problems with this).
Video format - Allows choice of various video file formats and codecs. Please test before using, as some may
not work properly on all devices! Also note:
-WebM does not support recording video (at the time of writing, it seems encoding in Vorbis audio format is
+WebM does not support recording audio (at the time of writing, it seems encoding in Vorbis audio format is
not supported on
Android).
WebM does not support storing location data ("Store location data" option).
@@ -1026,11 +1054,17 @@ behaviour of these buttons).
Video subtitles - This option is analogous to the "Stamp photos" option, but rather than embedding text into
the video itself, it stores the text in a separate subtitles
-(".SRT") file. Most decent video players should support
+(".SRT") file. Most decent video players should support
SRT files, and use them to display the information as subtitles. The subtitles will record the date and time. If "Store location data" is enabled (see
"Location settings" below), then the current location latitude and longitude coordinates will also be recorded (if the location is
known). Similarly for "Store compass direction". Note that you can control the formatting style for date, time and location using
-the options under the "Photo settings" menu (Datestamp format, Timestamp format, GPS stamp format, Distance unit).
+ the options under the "Photo settings" menu (Datestamp format, Timestamp format, GPS stamp format, Distance unit).
+
+Note that on Android 10, using this option means the ".SRT" files will show in most gallery apps as separate unplayable video
+ files. A workaround is to enable Settings/More camera controls/"Storage Access Framework".
+On Android 11+, this option is only available if Settings/More camera controls/"Storage Access Framework" is
+ enabled. This is due to changes in Android 11 which affect how applications are able to save files.
+
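For illustration, an ".SRT" file is plain text made of numbered cues, each with a time range and subtitle lines. A video subtitles file produced as described above might look roughly like the following - the exact date, time and GPS formatting depends on the stamp format settings, so treat the values here as hypothetical:

```
1
00:00:00,000 --> 00:00:00,999
01/06/2022 14:30:05
51.5074, -0.1278

2
00:00:01,000 --> 00:00:01,999
01/06/2022 14:30:06
51.5074, -0.1278
```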
Video bitrate (approx) - If set to a value other than "default", the default video bitrate is overridden. Higher values mean better
quality video, but the files take up more disk space. Note that some values may be unsupported by your device, and may
@@ -1083,7 +1117,7 @@ location. Location data will also be stored in videos (though only for devices t
Store compass direction - If selected, then photos will be tagged with the compass direction.
Only supported for JPEG format. Not supported for RAW photos (DNG format) or videos.
-Store compass direction - If selected, then photos will be tagged with the device's yaw, pitch and roll.
+Store yaw, pitch and roll - If selected, then photos will be tagged with the device's yaw, pitch and roll.
Note that Exif data does not have direct support for this, instead it will be written as a string in the Exif data's
User Comment for the image. Only supported for JPEG format. Not supported for RAW photos (DNG format) or videos.
@@ -1115,9 +1149,6 @@ specified by using this option.
Online help - Load this web page.
-Donate to support development - Loads the page for
-my donation app.
-
Camera API - If set to "Camera2 API", this enables support for the Camera2 API that was introduced
in Android 5. Changing this setting will cause Open Camera to restart. Camera2 API enables more advanced features
(including manual ISO/exposure, manual focus, HDR, exposure bracketing).
@@ -1151,7 +1182,8 @@ this information to the clipboard.
is the risk that settings on some devices may be incompatible with other devices. Also if the
saved settings file specified a save location, this may not be valid on the new device (or if
using "Storage Access Framework", you may have to reselect the folder in Open Camera, to grant
- write permission for the folder).
+ write permission for the folder). Note that on Android 10+, the file dialog will only let you
+ select a file inside Android/data/net.sourceforge.opencamera/files/.
Reset settings - Resets all Open Camera settings to their default. Selecting this option will cause
Open Camera to restart. Note that this will not delete any saved settings (see above options).
@@ -1211,21 +1243,16 @@ e.g., "extSdCard".
Android 4.4 - Unfortunately it is not possible for 3rd party apps to write to external SD cards.
This is not a bug or missing feature in Open Camera, rather that
Google
-have blocked write access to external SD cards in Android 4.4. There is one exception: if you use a
-folder in Android/data/net.sourceforge.opencamera/ on your SD card (use the "New folder" option to create
-the folders/sub-folders if necessary), that should allow saving to the SD card. Note that all files in this
-folder will be deleted if you uninstall Open Camera, so be careful that you don't lose your photo/videos
-if you use this method, and then uninstall!
+have blocked write access to external SD cards in Android 4.4.
Android 5.0 onwards - The restrictions on SD cards introduced in Android 4.4 still apply, however
-instead you can enable Settings/More camera controls/"Use Storage Access Framework", and this should allow you
+instead you can enable Settings/More camera controls/"Storage Access Framework", and this should allow you
to save to external SD cards. If when choosing a folder, you only see "Recent", you may need to click on the
-three dots at the top right to open the menu, to enable showing the drives (e.g., "Show internal storage".
+three dots at the top right to open the menu, to enable showing the drives (e.g., "Show internal storage").
Android 6.0 onwards - From Android 6, some devices support "Adoptable Storage" allowing you to
select to use
an SD card as internal storage. Note that not all devices support this, even if running Android 6 or later. If your device
doesn't support this, or you want to instead use an SD card as "portable storage", you'll have to use
-the Storage Access Framework method as with Android 5 (or the Android/data/net.sourceforge.opencamera/ method as
-with Android 4.4).
+the Storage Access Framework method as with Android 5.
Can you implement disabling shutter sound for my phone? -
@@ -1244,7 +1271,7 @@ with Android 4.4).
Photos or videos fail to save! - Firstly, if you're trying to save to an
external SD card, see "How can I save to my external SD card?" above. Otherwise:
-If Settings/More camera controls/"Use Storage Access Framework" is enabled,
+If Settings/More camera controls/"Storage Access Framework" is enabled,
in some cases the permission may be lost, try rechoosing the save location (from
Settings/More camera controls/"Save location").
If not using Storage Access Framework, but you have changed the save location,
@@ -1256,7 +1283,7 @@ external SD card, see "How can I save to my external SD card?" above. Otherwise:
I switched to a new phone, and now something doesn't work! - Google's auto-backup will
typically transfer settings to a new phone, but this may mean a camera-specific setting is no
longer relevant. In particular, if you set a non-default save location, it may be that the path is
-not valid on the new device, or if using Settings/More camera controls/"Use Storage Access Framework",
+not valid on the new device, or if using Settings/More camera controls/"Storage Access Framework",
you may need to rechoose the save location (from Settings/More camera controls/"Save location") to
grant permission to the new device. You can use Settings/Settings manager/"Reset settings" to reset
Open Camera to its original state, to rule out any issues from an Android backup from another
@@ -1271,9 +1298,9 @@ and untick Auto-level.
that are made available to third party applications. Usually this means front and back cameras, but some devices have
multiple front and/or back-facing cameras. Use the
switch multi-camera icon
-to switch between multiple front or back cameras. In some cases the extra cameras aren't made available to third party
-applications, so it isn't possible for Open Camera to support them. Even where they are, since there is no current standard
-on what the extra cameras are used for, it can't do anything with them other than allowing you to switch between them.
+to switch between multiple front or back cameras. Note that some devices do not expose the multiple cameras explicitly,
+but instead will automatically switch cameras as required when zooming in or out. In some cases the extra cameras aren't
+made available to third party applications, so it isn't possible for Open Camera to support them.
Why doesn't Open Camera support the maximum video resolution on my device? - If you
are using Camera2 API, make sure that you're not in slow motion mode (see "Speed" under
@@ -1467,6 +1494,8 @@ can paste the information into your web browser, email or whatever.
forums.
For some enquiries you may prefer to use email.
Please contact me at mark.harman.apps@gmail.com.
+Please note that I get a lot of emails
for Open Camera these days - I try to reply to as many as I can, but this is not always feasible. I do however
read every email and forum post.
diff --git a/_docs/history.html b/_docs/history.html
index 2ff449e9d6f6903943b03f7855390cc4627bed0f..1e08a256c5c225de62ce3fd41262f4ee6bff6022 100644
--- a/_docs/history.html
+++ b/_docs/history.html
@@ -24,16 +24,15 @@
-
+
+
+
@@ -48,6 +47,132 @@
+Version 1.50.1 (2022/06/08)
+
+FIXED Crash on OPPO devices for old camera API introduced in 1.50.
+
+Version 1.50 (2022/06/04)
+
+FIXED HDR photos came out black on some Samsung Galaxy devices with Android 12.
+FIXED Problems with flash on Camera2 API (Samsung Galaxy, OnePlus, Pixel 6 Pro). Galaxy and
+ OnePlus devices therefore no longer default to using the "alternative flash method".
+FIXED Problems with expo, HDR and long manual exposures on some devices (e.g., Pixel 6 Pro).
+FIXED Granting only approximate location permission on Android 12 would turn geotagging option
+ back off.
+FIXED On-screen text looked strange on Android 12.
+FIXED Gallery icon overlapped with navigation bar if using widescreen resolution with UI in left
+ or right handed mode.
+ADDED Support for Android 12's camera extensions API. When using Camera2 API, on selected devices
+ advanced photo modes are now available (e.g., Night on Pixel 6 Pro; Night, Bokeh and Beauty
+ on some Galaxy devices).
+ADDED Improved support for devices with multiple cameras, where extra cameras are exposed
+ via zooming in and out (e.g., Pixel 5/6).
+ADDED New debug option Settings/Photo settings/"Enable dummy capture HDR/expo fix". Please enable
+ this if you are having problems with HDR or expo bracketing mode on Samsung Galaxy devices
+ with Android 11+ (specifically if some expo images come out with the same exposures).
+UPDATED Removed "use addresses" and "say cheese" options. Sorry about that, but
+        this is due to new data privacy requirements on Google Play: although these options
+        used standard Android APIs, the information needed to satisfy those requirements
+        was not available for these APIs.
+UPDATED Now targeting Android 12. For remote control device options, new Bluetooth permissions are
+        used instead of requiring location permission.
+UPDATED Move gallery icon slightly to avoid overlapping with Android 12 camera privacy icon.
+UPDATED Made pinch zoom smoother, to allow finer control.
+
+Version 1.49.2 (2022/01/13)
+
+FIXED Dialog for "Save settings" shouldn't allow multiple lines.
+FIXED Crash for NR photo mode on some devices since version 1.49.
+UPDATED Switched to using AppCompat AppCompatActivity.
+UPDATED Photo stamp custom text now uses AppCompat libraries to support latest emoji.
+UPDATED Made appearance of info "toasts" more consistent.
+
+Version 1.49.1 (2021/09/20)
+
+FIXED Crop guides weren't drawn correctly in portrait orientation in 1.49.
+FIXED Diagonals grid wasn't drawn correctly in portrait orientation in 1.49.
+
+Version 1.49 (2021/09/07)
+
+FIXED Crash when failing to save photos/videos with mediastore (Android 10+ if not using Storage
+ Access Framework).
+FIXED Crash related to original camera API.
+FIXED Crash when using photo stamp with auto-level when angle is zero.
+FIXED Couldn't exit immersive mode on Android 11.
+FIXED Behaviour where widescreen preview aspect ratios show under on-screen navigation bar wasn't
+ working properly on Android 11.
+FIXED Manual white balance had inverted effect.
+FIXED Video subtitles file didn't work properly when video file restarted due to max filesize.
+FIXED Taking a photo in RAW only, then clicking on the gallery thumbnail would sometimes
+ incorrectly open an earlier non-RAW photo or video.
+FIXED Corrected pitch and compass line lengths for portrait vs landscape orientations.
+FIXED Single and double tap options to take photo weren't working correctly in panorama mode.
+FIXED When using manual ISO seekbar, sometimes incorrect ISO button would be highlighted.
+UPDATED Now supports portrait and landscape system orientations, rather than being locked to
+ landscape system orientation.
+UPDATED Double tap to take photo option no longer performs a touch to focus; this now only happens
+ from a single tap.
+UPDATED Improved performance when opening camera and clicking on gallery icon (mainly relevant for
+ using Storage Access Framework with Android 10+ when save folder has large number of files).
+UPDATED Set max preview exposure time to be 1/5s instead of 1/12s, for when using manual exposure.
+UPDATED Support longer exposure time (1/5s) on some Samsung Galaxy S devices.
+UPDATED Improvements to brightness levels for Noise Reduction, DRO and HDR photo modes (images
+ coming out too dark in some cases).
+UPDATED Improvement to Noise Reduction photo mode quality (improved ability to distinguish noise
+ from ghosting effects).
+UPDATED Improvement to Noise Reduction photo mode to avoid overexposing in lower light scenes.
+UPDATED Improved choosing when to use 8 images for Noise Reduction photo mode.
+UPDATED Optimisations for DRO and NR photo modes on Samsung devices.
+UPDATED Accessibility improvement, set hints for EditTexts.
+UPDATED Now targeting Android 11. Due to changes in Android 11, this means the "video subtitles" option
+ is now only available when saving with
+ Settings/More camera controls/"Storage Access Framework" enabled.
+UPDATED Updated some user interface icons.
+
+Version 1.48.3 (2020/11/20)
+
+FIXED Possible crash for panorama if failing to crop due to poor transformations; now fails
+ gracefully.
+FIXED Crash on EXTERNAL devices with Camera2 API that didn't support querying the view angles.
+FIXED Photos would sometimes fail to save on some devices with Storage Access Framework, when some
+ options were enabled (options like DRO, HDR, auto-level, photostamp that require
+ post-processing; custom Exif tags like artist or copyright; or when using geotagging with
+ Camera2 API).
+FIXED HDR scenes with both very bright and very dark regions could come out overexposed.
+FIXED Possible misalignment for HDR scenes with very bright or very dark images.
+FIXED Corrupt videos could be left over if video failed to start.
+FIXED Possible problem taking photos on some devices with LIMITED Camera2 API support.
+FIXED Possible problem with default edge mode and noise reduction mode behaviours on some devices
+ with LIMITED Camera2 API support.
+FIXED UI would become sluggish if camera or storage permission denied with "Don't ask again".
+UPDATED Now supporting "scoped storage" on Android 10+. This means storage permission is no longer
+        required on Android 10+. However, this means the following changes:
+ * Saving outside of DCIM/ is no longer possible unless using the Storage Access Framework
+ option. If you had set up a custom save folder outside of DCIM/ and are on Android 10+,
+ it will be reset to the default DCIM/OpenCamera/ folder. If you want to continue saving
+ outside of DCIM/, you can enable
+ Settings/More camera controls/"Use Storage Access Framework" and choose a new folder.
+ * If using Video subtitles option, then the .SRT files will show up in gallery
+ applications, unless Settings/More camera controls/"Use Storage Access Framework" is
+ enabled.
+        Note that these changes are necessary due to changes in Android that applications are
+        required to support.
+UPDATED Use seekbar for more settings (audio control sensitivity, image quality, photo stamp font
+ size).
+UPDATED Debug XML files for panorama now saved in Android/data/net.sourceforge.opencamera/files/.
+UPDATED Camera now closed when in settings, or when preview is otherwise in background.
+
+Version 1.48.2 (2020/07/12)
+
+FIXED Manual focus and focus bracketing seekbars weren't being hidden when in immersive mode.
+FIXED Video subtitles would stop before end of video on some devices when using Storage Access
+ Framework.
+UPDATED Switched to AndroidX support library.
+UPDATED Artist, Copyright exif tags option now supported for devices running Android 6 or earlier.
+UPDATED Selecting remote device type for Bluetooth remote control now calls Open Camera's
+ DeviceScanner directly; DeviceScanner activity no longer exported.
+
Version 1.48.1 (2020/05/02)
FIXED Crash on devices with Camera2 API where camera reports no picture, video or preview
diff --git a/_docs/index.html b/_docs/index.html
index 6f5829bd1d450466e8d7478ceaa20a45eee76fe3..d3bd47db00de4b1371990a0485ceddf6fb4ad970 100644
--- a/_docs/index.html
+++ b/_docs/index.html
@@ -16,6 +16,13 @@
}
-->
+
+
@@ -106,7 +112,7 @@ browsers -->
Option to auto-level so your pictures are perfectly level no matter what.
Expose your camera's functionality: support for scene modes, color effects, white balance, ISO, exposure compensation/lock, selfie with "screen flash", HD video and more.
@@ -213,19 +215,27 @@ Also see "Can I use the Open Camera source code in my app?" under the Apache license version 2.0 (some cases include modifications, no need to credit me).
In particular:
baseline_add_a_photo_white_48.png,
- baseline_bluetooth_white_48.png, baseline_check_white_48.png, baseline_close_white_48.png, baseline_filter_vintage_white_48.png,
+    baseline_bedtime_white_48.png,
+ baseline_bluetooth_white_48.png, baseline_check_white_48.png, baseline_close_white_48.png,
+ baseline_delete_white_48.png,
+ baseline_face_retouching_natural_white_48.png,
+ baseline_filter_vintage_white_48.png,
baseline_folder_open_white_48.png,
baseline_highlight_white_48.png,
baseline_panorama_horizontal_white_48.png,
baseline_photo_library_white_48.png,
+ baseline_portrait_white_48.png,
baseline_remove_red_eye_white_48.png,
baseline_rotate_left_white_48.png, baseline_rotate_right_white_48.png,
baseline_shutter_speed_white_48.png,
+ baseline_switch_camera_white_48.png,
baseline_text_fields_red_48.png (modified from baseline_text_fields_white_48), baseline_text_fields_white_48.png,
exposure_locked.png (modified from baseline_lock_white_48 and ic_exposure_white_48dp),
exposure_unlocked.png (modified from baseline_lock_open_white_48 and ic_exposure_white_48dp),
- flash_auto.png (from ic_action_flash_automatic), flash_off.png (from ic_action_flash_off),
+ flash_auto.png (from baseline_flash_auto_white_48), flash_off.png (from baseline_flash_off_white_48),
flash_on.png (from ic_action_flash_on),
+ focus_mode_continuous_picture.png and focus_mode_continuous_video.png (from baseline_loop_white_48),
+ focus_mode_infinity (from baseline_loop_white_48),
focus_mode_locked.png (modified from baseline_lock_white_48),
ic_burst_mode_white_48dp.png, ic_colorize_white_48dp.png,
ic_exposure_red_48dp.png, ic_exposure_white_48dp.png, ic_face_red_48dp.png (modified from ic_face_white_48dp), ic_face_white_48dp.png,
@@ -243,15 +253,13 @@ Also see "Can I use the Open Camera source code in my app?" under the Modified versions of some of these icons are also used on this website.
diff --git a/_docs/info.html b/_docs/info.html
index 11f210a69d7f48360757d0ee8e0d611787e7eaee..91c2124f99d4e4ab60c7d8a3bb5202cbda9dbb39 100644
--- a/_docs/info.html
+++ b/_docs/info.html
@@ -24,16 +24,15 @@
-
+
+
+
@@ -60,50 +59,13 @@
-
Privacy policy: Please note that some information such as name, email and amount will be shared with me by PayPal. All PayPal transactions are subject to the PayPal Privacy Policy (alternatively known as PayPal's Privacy Statement).
+
I am not currently accepting donations. Thanks to those who have supported me in the past!
The APK files are also available from
diff --git a/_docs/privacy_oc.html b/_docs/privacy_oc.html
index 38972821d7f243aa56c221b1deab5488cb2991dd..48728d312ceeef5ab4edf66267619d0ebd875a1c 100644
--- a/_docs/privacy_oc.html
+++ b/_docs/privacy_oc.html
@@ -24,16 +24,15 @@
-
+
+
+
@@ -51,44 +50,65 @@
+
Open Camera is developed by Mark Harman.
+
Open Camera accesses and records camera sensor and microphone data, which is used for the purpose
of taking photos and recording videos, to fulfil its purpose as a camera. Microphone permission is also used for the optional "Audio control" options.
-
Open Camera requires permission to "access photos, media and files on your devices", as this permission is required for Android to
-save resultant files such as photos and videos to your device.
+
Open Camera requires permission (at least on Android 9 and earlier, or when using versions of Open Camera older than 1.48.3) to
+ "access photos, media and files on your devices" (storage permission), as this permission is required for Android to save resultant files such as photos and videos to your device.
+
+
Location permission is requested in order to deliver the optional geotagging features (for photos and videos, including stamp and subtitles options).
+ When relevant option(s) are enabled, your device location will be stored in photo/video/subtitle files.
-
Location permission is required for the optional geotagging features (for photos and videos, including stamp and subtitles options).
- When relevant option(s) are enabled, your device location will be stored in photo/video/subtitle files.
- Location permission is also required to connect to Bluetooth remote control devices.
+
Bluetooth permissions are used to allow the optional feature to discover and connect to Bluetooth LE remote control devices;
+ the Bluetooth remote control feature also requires location permission (on Android 11 or earlier) or
+ Nearby Devices permission (on Android 12 or later).
-
Bluetooth permission is required for communicating with some supported Bluetooth remote control devices.
+
Resultant data such as photos or videos can be shared with
+ other apps if you use the share option in Open Camera, or when Open Camera is called by
+ another app on your device, or when you use the Storage Access Framework option to save
+ to another app or service.
+
+
Data handling procedures, data retention and deletion policies: Open Camera
+ does not transmit personal or sensitive information to me.
Since Open Camera also uses operating system APIs, you should review relevant privacy policies
such as for your device, manufacturer, operating system and/or Google accounts. For example:
-
The optional voice control option uses the Android
+
For versions 1.49.2 or earlier: the optional voice control option used the Android
speech recognition service.
When enabled, audio data is likely to be sent to remote servers by Android to perform speech recognition.
+ This is subject to the Data Processing Addendum for Products where Google is a Data Processor,
+ located at
+ https://privacy.google.com/businesses/gdprprocessorterms/ , as updated from time to time.
+ This option is no longer available in version 1.50 onwards.
-
The "addresses" option for photo stamp or video subtitles uses the Android
+
For versions 1.49.2 or earlier: The "addresses" option for photo stamp or video subtitles used the Android
Geocoder API.
- When enabled, this requires that your device transmits location data across the Internet to a
+    When this option is enabled, the API delivers this functionality by transmitting your device's location data across the Internet to a
third party (which may depend on what "backend services" are installed on your device).
+ This option is no longer available in version 1.50 onwards.
Apps/services such as cloud services on your device may auto-upload photos and videos that are saved on your device.
+
If you have inquiries about my privacy policy, please contact me by email at
+ mark.harman.apps@gmail.com.
+
Although Open Camera is ad-free, the Open Camera website has ads via Google Adsense: Third party vendors, including Google, use cookies to
serve ads based on a user's previous visits to this website or other websites. Google's use of advertising cookies enables it and
its partners to serve ads based on people's visits to this site and/or other sites on the Internet. You may opt out of personalised
-advertising by visiting Google's Ads Settings. Alternatively, you can
-opt out of some third-party vendors' uses of cookies for personalised advertising by visiting
+advertising by visiting Google's Ads Settings. The cookies of other third-party
+vendors or ad networks may also be used to serve ads. You can opt out of some third-party vendors' uses of cookies for personalised advertising by visiting
www.aboutads.info.
Update: I have instructed Google to not display personalised ads to users in the EEA.
+
Note that cookies are still used for serving even non-personalised ads.
+
The Open Camera website also uses Google Analytics which uses cookies, please see their
Privacy Policy for more details.
diff --git a/_docs/stylesheet.css b/_docs/stylesheet.css
index fbf030952275aef112e0a70c5707a45e4723c551..804a107ded9d435860fab932acf0d1ea9142600d 100644
--- a/_docs/stylesheet.css
+++ b/_docs/stylesheet.css
@@ -1,4 +1,5 @@
body {
color: #000000;
background-color: rgb(245,236,220);
+ font-family: Tahoma, Geneva, sans-serif;
}
diff --git a/_docs/switch_camera.png b/_docs/switch_camera.png
deleted file mode 100644
index c5f8a1860822bb8f3a039406330616def0f5df0f..0000000000000000000000000000000000000000
Binary files a/_docs/switch_camera.png and /dev/null differ
diff --git a/_docs/trash.png b/_docs/trash.png
deleted file mode 100644
index 81493dd2d8de1217576d6dffc5efe19352c2294b..0000000000000000000000000000000000000000
Binary files a/_docs/trash.png and /dev/null differ
diff --git a/androidx_LICENSE-2.0.txt b/androidx_LICENSE-2.0.txt
new file mode 100644
index 0000000000000000000000000000000000000000..d645695673349e3947e8e5ae42332d0ac3164cd7
--- /dev/null
+++ b/androidx_LICENSE-2.0.txt
@@ -0,0 +1,202 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
diff --git a/app/build.gradle b/app/build.gradle
index a74a36609789c4a26c2b7c04cc8bb14504f0408f..890003d1b52a3e20765587aaba075c457998537c 100644
--- a/app/build.gradle
+++ b/app/build.gradle
@@ -1,7 +1,7 @@
apply plugin: 'com.android.application'
android {
- compileSdkVersion 30
+ compileSdkVersion 31
compileOptions.encoding = 'UTF-8'
compileOptions {
@@ -12,7 +12,7 @@ android {
defaultConfig {
applicationId "foundation.e.camera"
minSdkVersion 21
- targetSdkVersion 30
+ targetSdkVersion 31
renderscriptTargetApi 21
//renderscriptSupportModeEnabled true // don't use support library as it bloats the APK, and we don't need pre-4.4 support
@@ -34,16 +34,15 @@ android {
}
}
- lintOptions {
- abortOnError false
- checkReleaseBuilds false
- abortOnError false
- }
// needed to use android.test package (ActivityInstrumentationTestCase2 etc) when targeting sdk 28 (Android 9) -
// see https://developer.android.com/training/testing/set-up-project
useLibrary 'android.test.runner'
useLibrary 'android.test.base'
+ lint {
+ abortOnError false
+ checkReleaseBuilds false
+ }
//useLibrary 'android.test.mock'
aaptOptions {
@@ -58,5 +57,12 @@ dependencies {
implementation 'androidx.legacy:legacy-support-v4:1.0.0'
implementation "androidx.constraintlayout:constraintlayout:2.1.3"
implementation "org.greenrobot:eventbus:3.3.1"
- testImplementation 'junit:junit:4.13'
+ androidTestImplementation 'androidx.test.ext:junit:1.1.3'
+
+ // appcompat version must be 1.4.0 or later to satisfy emoji policy!
+ implementation 'androidx.appcompat:appcompat:1.4.1'
+
+ implementation 'androidx.exifinterface:exifinterface:1.3.3'
+
+ testImplementation 'junit:junit:4.13.1'
}
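Taken together, the build.gradle hunks above move the module to compileSdk/targetSdk 31, replace the deprecated `lintOptions` block with the newer `lint` DSL, and add the AppCompat and ExifInterface dependencies. A consolidated sketch of the resulting file is below — the version numbers are copied from the hunks, but treat the exact shape of the file as illustrative rather than a verbatim copy of the repository:

```groovy
// Consolidated sketch of app/build.gradle after this diff (illustrative only).
apply plugin: 'com.android.application'

android {
    compileSdkVersion 31

    defaultConfig {
        applicationId "foundation.e.camera"
        minSdkVersion 21
        targetSdkVersion 31
    }

    // AGP 7.x: lint { } replaces the deprecated lintOptions { } block
    lint {
        abortOnError false
        checkReleaseBuilds false
    }
}

dependencies {
    // appcompat must be 1.4.0 or later to satisfy the emoji policy
    implementation 'androidx.appcompat:appcompat:1.4.1'
    implementation 'androidx.exifinterface:exifinterface:1.3.3'
    testImplementation 'junit:junit:4.13.1'
    androidTestImplementation 'androidx.test.ext:junit:1.1.3'
}
```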
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/AvgTests.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/AvgTests.java
index a502dd85c51af99c130d9270310c29efd19014cb..6e5334ce5364f4b3c7716084adf5af1155cd97c8 100644
--- a/app/src/androidTest/java/net/sourceforge/opencamera/test/AvgTests.java
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/AvgTests.java
@@ -9,6 +9,7 @@ public class AvgTests {
* To use these tests, the testdata/ subfolder should be manually copied to the test device in the DCIM/testOpenCamera/
* folder (so you have DCIM/testOpenCamera/testdata/). We don't use assets/ as we'd end up with huge APK sizes which take
* time to transfer to the device every time we run the tests.
+ * On Android 10+, scoped storage permission needs to be given to Open Camera for the DCIM/testOpenCamera/ folder.
*/
public static Test suite() {
TestSuite suite = new TestSuite(MainTests.class.getName());
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/HDRNTests.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/HDRNTests.java
index 7283de7a3f76b401af194b7716ed4d7e74a83a10..4c82cfd8a2d3dfcd2e48639e456c5b5a66161130 100644
--- a/app/src/androidTest/java/net/sourceforge/opencamera/test/HDRNTests.java
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/HDRNTests.java
@@ -9,6 +9,7 @@ public class HDRNTests {
* To use these tests, the testdata/ subfolder should be manually copied to the test device in the DCIM/testOpenCamera/
* folder (so you have DCIM/testOpenCamera/testdata/). We don't use assets/ as we'd end up with huge APK sizes which take
* time to transfer to the device every time we run the tests.
+ * On Android 10+, scoped storage permission needs to be given to Open Camera for the DCIM/testOpenCamera/ folder.
*/
public static Test suite() {
TestSuite suite = new TestSuite(MainTests.class.getName());
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/HDRTests.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/HDRTests.java
index 9efcae467926839fd51b69362dd5037a541d1c25..5c6b5622412ada30e95696045ee975052da70aa8 100644
--- a/app/src/androidTest/java/net/sourceforge/opencamera/test/HDRTests.java
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/HDRTests.java
@@ -9,6 +9,7 @@ public class HDRTests {
* To use these tests, the testdata/ subfolder should be manually copied to the test device in the DCIM/testOpenCamera/
* folder (so you have DCIM/testOpenCamera/testdata/). We don't use assets/ as we'd end up with huge APK sizes which take
* time to transfer to the device every time we run the tests.
+ * On Android 10+, scoped storage permission needs to be given to Open Camera for the DCIM/testOpenCamera/ folder.
*/
public static Test suite() {
TestSuite suite = new TestSuite(MainTests.class.getName());
@@ -74,6 +75,10 @@ public class HDRTests {
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testHDR55"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testHDR56"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testHDR57"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testHDR58"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testHDR59"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testHDR60"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testHDR61"));
return suite;
}
}
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/MainActivityTest.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/MainActivityTest.java
index 95a0289533fe8d3d1fd550a317eea9b6ec40afb9..d59715747073e88d9abd282f733e48f42a1ad344 100644
--- a/app/src/androidTest/java/net/sourceforge/opencamera/test/MainActivityTest.java
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/MainActivityTest.java
@@ -1,6 +1,8 @@
package net.sourceforge.opencamera.test;
import java.io.File;
+import java.io.FileDescriptor;
+import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.OutputStream;
@@ -30,16 +32,19 @@ import net.sourceforge.opencamera.SaveLocationHistory;
import net.sourceforge.opencamera.cameracontroller.CameraController;
import net.sourceforge.opencamera.preview.Preview;
import net.sourceforge.opencamera.ui.FolderChooserDialog;
+import net.sourceforge.opencamera.ui.MainUI;
import net.sourceforge.opencamera.ui.PopupView;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
+import android.content.ContentUris;
+import android.content.ContentValues;
import android.content.Intent;
import android.content.SharedPreferences;
//import android.content.res.AssetManager;
+import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
-//import android.graphics.BitmapFactory;
import android.graphics.Color;
import android.graphics.Matrix;
import android.graphics.Point;
@@ -49,12 +54,14 @@ import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.params.TonemapCurve;
import android.location.Location;
import android.media.CamcorderProfile;
-import android.media.ExifInterface;
+import androidx.exifinterface.media.ExifInterface;
import android.media.MediaScannerConnection;
+import android.net.Uri;
import android.os.Build;
import android.os.Environment;
-//import android.os.Environment;
+import android.os.ParcelFileDescriptor;
import android.preference.PreferenceManager;
+import android.provider.DocumentsContract;
import android.provider.MediaStore;
import android.renderscript.Allocation;
import androidx.annotation.RequiresApi;
@@ -69,6 +76,8 @@ import android.widget.SeekBar;
import android.widget.TextView;
import android.widget.ZoomControls;
+// ignore warning about "Call to Thread.sleep in a loop", this is only test code
+@SuppressWarnings("BusyWait")
public class MainActivityTest extends ActivityInstrumentationTestCase2<MainActivity> {
private static final String TAG = "MainActivityTest";
private MainActivity mActivity = null;
@@ -81,6 +90,10 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= 1000) {
assertTrue(mPreview.isPreviewBitmapEnabled());
}
@@ -2735,8 +2784,10 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= 1000) {
assertFalse(mPreview.isPreviewBitmapEnabled());
assertFalse(mPreview.refreshPreviewBitmapTaskIsRunning());
@@ -2797,9 +2848,10 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2<MainActivity> {
+    private List<String> mediaFilesinSaveFolder(Uri baseUri, String bucket_id, UriType uri_type) {
+        List<String> files = new ArrayList<>();
+ final int column_name_c = 0; // filename (without path), including extension
+
+ String [] projection;
+ switch( uri_type ) {
+ case MEDIASTORE_IMAGES:
+ projection = new String[] {MediaStore.Images.ImageColumns.DISPLAY_NAME};
+ break;
+ case MEDIASTORE_VIDEOS:
+ projection = new String[] {MediaStore.Video.VideoColumns.DISPLAY_NAME};
+ break;
+ case STORAGE_ACCESS_FRAMEWORK:
+ projection = new String[] {DocumentsContract.Document.COLUMN_DISPLAY_NAME};
+ break;
+ default:
+ throw new RuntimeException("unknown uri_type: " + uri_type);
+ }
+
+ String selection = "";
+ switch( uri_type ) {
+ case MEDIASTORE_IMAGES:
+ selection = MediaStore.Images.ImageColumns.BUCKET_ID + " = " + bucket_id;
+ break;
+ case MEDIASTORE_VIDEOS:
+ selection = MediaStore.Video.VideoColumns.BUCKET_ID + " = " + bucket_id;
+ break;
+ case STORAGE_ACCESS_FRAMEWORK:
+ break;
+ default:
+ throw new RuntimeException("unknown uri_type: " + uri_type);
+ }
+ Log.d(TAG, "selection: " + selection);
+
+ Cursor cursor = mActivity.getContentResolver().query(baseUri, projection, selection, null, null);
+ if( cursor != null && cursor.moveToFirst() ) {
+ Log.d(TAG, "found: " + cursor.getCount());
+
+ do {
+ String name = cursor.getString(column_name_c);
+ files.add(name);
+ }
+ while( cursor.moveToNext() );
+ }
+
+ if( cursor != null ) {
+ cursor.close();
+ }
+
+ return files;
+ }
+
+ /** Returns an array of filenames (not including full path) in the current save folder.
+ */
+ private String [] filesInSaveFolder() {
+ Log.d(TAG, "filesInSaveFolder");
+ if( MainActivity.useScopedStorage() ) {
+            List<String> files = new ArrayList<>();
+ if( mActivity.getStorageUtils().isUsingSAF() ) {
+ // See documentation for StorageUtils.getLatestMediaSAF() - for some reason with scoped storage when not having READ_EXTERNAL_STORAGE,
+ // we can't query the mediastore for files saved via SAF!
+ Uri treeUri = mActivity.getStorageUtils().getTreeUriSAF();
+ Uri baseUri = DocumentsContract.buildChildDocumentsUriUsingTree(treeUri, DocumentsContract.getTreeDocumentId(treeUri));
+ files.addAll( mediaFilesinSaveFolder(baseUri, null, UriType.STORAGE_ACCESS_FRAMEWORK) );
+ }
+ else {
+ String save_folder = mActivity.getStorageUtils().getImageFolderPath();
+ String bucket_id = String.valueOf(save_folder.toLowerCase().hashCode());
+ files.addAll( mediaFilesinSaveFolder(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, bucket_id, UriType.MEDIASTORE_IMAGES) );
+ files.addAll( mediaFilesinSaveFolder(MediaStore.Video.Media.EXTERNAL_CONTENT_URI, bucket_id, UriType.MEDIASTORE_VIDEOS) );
+ }
+
+ if( files.size() == 0 ) {
+ return null;
+ }
+ else {
+ return files.toArray(new String[0]);
+ }
+ }
+ else {
+ File folder = mActivity.getImageFolder();
+ File [] files = folder.listFiles();
+ if( files == null )
+ return null;
+ String [] filenames = new String[files.length];
+            for(int i=0;i<files.length;i++)
+                filenames[i] = files[i].getName();
+            return filenames;
+        }
+    }
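A note on the `bucket_id` computation in `filesInSaveFolder()` above: it relies on the MediaStore convention that a folder's `BUCKET_ID` is the `String.hashCode()` of the lowercased folder path. A minimal plain-Java sketch of that convention (the class name and example path here are illustrative, not from Open Camera):

```java
public class BucketIdDemo {
    // MediaStore's BUCKET_ID convention: hashCode() of the lowercased folder path
    static String bucketId(String folderPath) {
        return String.valueOf(folderPath.toLowerCase().hashCode());
    }

    public static void main(String[] args) {
        // the same folder written in a different case maps to the same bucket
        System.out.println(bucketId("/storage/emulated/0/DCIM/OpenCamera")
                .equals(bucketId("/storage/emulated/0/dcim/opencamera"))); // true
    }
}
```

Because the ID is derived from the lowercased path, a case-insensitive filesystem and MediaStore agree on which bucket a saved file lands in, which is what lets the test query by `BUCKET_ID` instead of by path.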
@@ -4738,11 +4963,13 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= Build.VERSION_CODES.R ? 0 : 1 // boo, Android 11 doesn't allow video subtitles to be saved with mediastore API!
+ );
+ }
+
+ /** Tests video subtitles option, when using Storage Access Framework.
+ */
+ public void testTakeVideoSubtitlesSAF() throws InterruptedException {
+ Log.d(TAG, "testTakeVideoSubtitlesSAF");
+
+ setToDefault();
+ {
+ SharedPreferences settings = PreferenceManager.getDefaultSharedPreferences(mActivity);
+ SharedPreferences.Editor editor = settings.edit();
+ editor.putString(PreferenceKeys.VideoSubtitlePref, "preference_video_subtitle_yes");
+ editor.putBoolean(PreferenceKeys.UsingSAFPreferenceKey, true);
+ editor.putString(PreferenceKeys.SaveLocationSAFPreferenceKey, "content://com.android.externalstorage.documents/tree/primary%3ADCIM%2FOpenCamera");
+ editor.apply();
+ updateForSettings();
+ }
+
subTestTakeVideo(false, false, false, false, null, 5000, false, 1);
}
/** Tests video subtitles option, including GPS - also tests losing the connection.
+ * Also test with Storage Access Framework, so this can run on Android 11+.
*/
- public void testTakeVideoSubtitlesGPS() throws InterruptedException {
- Log.d(TAG, "testTakeVideoSubtitlesGPS");
+ public void testTakeVideoSubtitlesGPSSAF() throws InterruptedException {
+ Log.d(TAG, "testTakeVideoSubtitlesGPSSAF");
setToDefault();
{
@@ -5892,6 +6169,8 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= 3000 - time_tol_ms );
assertTrue( video_time <= 3000 + time_tol_ms );
@@ -5999,7 +6278,7 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= 3000 - time_tol_ms );
assertTrue( video_time <= 3000 + time_tol_ms );
@@ -6029,7 +6308,7 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= 6000 - time_tol_ms );
assertTrue( video_time <= 6000 + time_tol_ms );
@@ -6079,7 +6358,7 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= 3000 - time_tol_ms );
assertTrue( video_time <= 3000 + time_tol_ms );
@@ -6109,7 +6388,7 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= 3000 - time_tol_ms );
assertTrue( video_time <= 3000 + time_tol_ms );
@@ -6297,12 +6576,12 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= video_time_s );
}
Log.d(TAG, "video recording now stopped");
@@ -6382,6 +6661,11 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= Build.VERSION_CODES.O ) {
assertTrue(mPreview.isVideoRecording());
+
+ long video_time = mPreview.getVideoTime(false);
+ long video_time_this_file = mPreview.getVideoTime(true);
+ assertEquals(video_time, video_time_this_file);
+
Log.d(TAG, "wait");
try {
Thread.sleep(10000);
@@ -6400,10 +6684,10 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= video_time_s );
}
Log.d(TAG, "video recording now stopped - wait for restart");
@@ -6426,7 +6710,7 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= video_time_s );
@@ -6480,6 +6764,13 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= Build.VERSION_CODES.O ) {
assertTrue( n_new_files >= 2 );
}
+
+ // if we've restarted, the total video time should be longer than the video time for the most recent file
+ long video_time = mPreview.getVideoTime(false);
+ long video_time_this_file = mPreview.getVideoTime(true);
+ Log.d(TAG, "video_time: " + video_time);
+ Log.d(TAG, "video_time_this_file: " + video_time_this_file);
+ assertTrue(video_time > video_time_this_file + 1000);
}
/** Max filesize is for ~4.5s, and max duration is 5s, check we only get 1 video.
@@ -6508,10 +6799,10 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= video_time_s );
}
Log.d(TAG, "video recording now stopped - check we don't restart");
@@ -6577,6 +6868,10 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2 1 ) {
@@ -8744,6 +9113,7 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2 " + zoom1);
+ assertTrue(zoom1 >= zoom0);
+ }
+
if( mPreview.supportsFocus() ) {
assertFalse(mPreview.hasFocusArea());
assertNull(mPreview.getCameraController().getFocusAreas());
@@ -9267,7 +9716,8 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2<MainActivity> {
-            if( mActivity.getPreview().getCameraController().captureResultHasIso() && mActivity.getPreview().getCameraController().captureResultIso() >= CameraController.ISO_FOR_DARK )
+ if( mActivity.getPreview().getCameraController().captureResultHasIso() && HDRProcessor.sceneIsLowLight( mActivity.getPreview().getCameraController().captureResultIso(), mActivity.getPreview().getCameraController().captureResultExposureTime() ) )
assertEquals(CameraController.N_IMAGES_NR_DARK_LOW_LIGHT, mActivity.getPreview().getCameraController().getBurstTotal());
// reset
mActivity.getApplicationInterface().setNRMode("preference_nr_mode_normal");
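The change above swaps a bare ISO threshold for a call to `HDRProcessor.sceneIsLowLight(iso, exposure_time)`, which also considers the exposure time. A hypothetical sketch of such a predicate (the threshold values and constant names below are illustrative assumptions, not Open Camera's actual ones):

```java
public class LowLightDemo {
    // illustrative thresholds only; Open Camera's real values may differ
    static final int ISO_FOR_DARK = 1100;
    static final long EXPOSURE_TIME_FOR_DARK_NS = 1_000_000_000L / 12;

    // treat the scene as low light when the sensor reports high gain or a long exposure
    static boolean sceneIsLowLight(int iso, long exposureTimeNs) {
        return iso >= ISO_FOR_DARK || exposureTimeNs >= EXPOSURE_TIME_FOR_DARK_NS;
    }

    public static void main(String[] args) {
        System.out.println(sceneIsLowLight(1600, 1_000_000_000L / 30));  // true: high ISO
        System.out.println(sceneIsLowLight(100, 1_000_000_000L / 100)); // false: bright scene
    }
}
```

Folding exposure time into the check matters for the test: a camera can report moderate ISO yet still be in a dark scene if it compensates with a long exposure, so the burst-count assertion would otherwise be flaky.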
@@ -11037,10 +11653,7 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2<MainActivity> {
+    /** Tests HDR algorithm on test samples "testHDR58".
+     */
+    public void testHDR58() throws IOException, InterruptedException {
+        Log.d(TAG, "testHDR58");
+
+        setToDefault();
+
+        // list assets
+        List<Bitmap> inputs = new ArrayList<>();
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR58/IMG_20190911_210146_0.jpg") );
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR58/IMG_20190911_210146_1.jpg") );
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR58/IMG_20190911_210146_2.jpg") );
+
+ HistogramDetails hdrHistogramDetails = subTestHDR(inputs, "testHDR58_output.jpg", false, 1250, 1000000000L/10);
+ //HistogramDetails hdrHistogramDetails = subTestHDR(inputs, "testHDR58_output.jpg", false, 1250, 1000000000L/10, HDRProcessor.TonemappingAlgorithm.TONEMAPALGORITHM_CLAMP);
+
+ checkHistogramDetails(hdrHistogramDetails, 11, 119, 255);
+ }
+
+ /** Tests HDR algorithm on test samples "testHDR59".
+ */
+ public void testHDR59() throws IOException, InterruptedException {
+ Log.d(TAG, "testHDR59");
+
+ setToDefault();
+
+ // list assets
+        List<Bitmap> inputs = new ArrayList<>();
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR59/IMG_20190911_210154_0.jpg") );
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR59/IMG_20190911_210154_1.jpg") );
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR59/IMG_20190911_210154_2.jpg") );
+
+ HistogramDetails hdrHistogramDetails = subTestHDR(inputs, "testHDR59_output.jpg", false, 1250, 1000000000L/10);
+ //HistogramDetails hdrHistogramDetails = subTestHDR(inputs, "testHDR59_output.jpg", false, 1250, 1000000000L/10, HDRProcessor.TonemappingAlgorithm.TONEMAPALGORITHM_CLAMP);
+
+ //checkHistogramDetails(hdrHistogramDetails, 0, 75, 255);
+ }
+
+ /** Tests HDR algorithm on test samples "testHDR60".
+ */
+ public void testHDR60() throws IOException, InterruptedException {
+ Log.d(TAG, "testHDR60");
+
+ setToDefault();
+
+ // list assets
+        List<Bitmap> inputs = new ArrayList<>();
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR60/IMG_20200507_020319_0.jpg") );
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR60/IMG_20200507_020319_1.jpg") );
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR60/IMG_20200507_020319_2.jpg") );
+
+ HistogramDetails hdrHistogramDetails = subTestHDR(inputs, "testHDR60_output.jpg", false, 491, 1000000000L/10);
+ //HistogramDetails hdrHistogramDetails = subTestHDR(inputs, "testHDR60_output.jpg", false, 491, 1000000000L/10, HDRProcessor.TonemappingAlgorithm.TONEMAPALGORITHM_CLAMP);
+
+ //checkHistogramDetails(hdrHistogramDetails, 0, 75, 255);
+ }
+
+ /** Tests HDR algorithm on test samples "testHDR61".
+ */
+ public void testHDR61() throws IOException, InterruptedException {
+ Log.d(TAG, "testHDR61");
+
+ setToDefault();
+
+ // list assets
+        List<Bitmap> inputs = new ArrayList<>();
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR61/IMG_20191111_145230_0.jpg") );
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR61/IMG_20191111_145230_1.jpg") );
+ inputs.add( getBitmapFromFile(hdr_images_path + "testHDR61/IMG_20191111_145230_2.jpg") );
+
+ HistogramDetails hdrHistogramDetails = subTestHDR(inputs, "testHDR61_output.jpg", false, 50, 1000000000L/5025);
+
+ checkHistogramDetails(hdrHistogramDetails, 0, 86, 254);
+
+ int [] exp_offsets_x = {0, 0, 1};
+ int [] exp_offsets_y = {0, 0, -2};
+ checkHDROffsets(exp_offsets_x, exp_offsets_y);
+ }
+
/** Tests HDR algorithm on test samples "testHDRtemp".
- * Used for one-off testing, or to recreate HDR images from the base exposures to test an updated alorithm.
+ * Used for one-off testing, or to recreate HDR images from the base exposures to test an updated algorithm.
* The test images should be copied to the test device into DCIM/testOpenCamera/testdata/hdrsamples/testHDRtemp/ .
*/
public void testHDRtemp() throws IOException, InterruptedException {
@@ -13505,7 +14286,7 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2<MainActivity> {
        List<Long> times = new ArrayList<>();
long time_s = System.currentTimeMillis();
- HDRProcessor.AvgData avg_data = mActivity.getApplicationInterface().getHDRProcessor().processAvg(bitmap0, bitmap1, avg_factor, iso, zoom_factor);
+ HDRProcessor.AvgData avg_data = mActivity.getApplicationInterface().getHDRProcessor().processAvg(bitmap0, bitmap1, avg_factor, iso, exposure_time, zoom_factor);
Allocation allocation = avg_data.allocation_out;
times.add(System.currentTimeMillis() - time_s);
// processAvg recycles both bitmaps
@@ -13528,7 +14309,7 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2= Build.VERSION_CODES.Q ?
+ MediaStore.Images.Media.getContentUri(MediaStore.VOLUME_EXTERNAL_PRIMARY) :
+ MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
+
+ // first try to delete pre-existing image
+ Uri old_uri = getUriFromName(folder, name);
+ if( old_uri != null ) {
+ Log.d(TAG, "delete: " + old_uri);
+ mActivity.getContentResolver().delete(old_uri, null, null);
+ }
+
+ contentValues = new ContentValues();
+ contentValues.put(MediaStore.Images.Media.DISPLAY_NAME, name);
+ String extension = name.substring(name.lastIndexOf("."));
+ String mime_type = mActivity.getStorageUtils().getImageMimeType(extension);
+ Log.d(TAG, "mime_type: " + mime_type);
+ contentValues.put(MediaStore.Images.Media.MIME_TYPE, mime_type);
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ String relative_path = Environment.DIRECTORY_DCIM + File.separator;
+ Log.d(TAG, "relative_path: " + relative_path);
+ contentValues.put(MediaStore.Images.Media.RELATIVE_PATH, relative_path);
+ contentValues.put(MediaStore.Images.Media.IS_PENDING, 1);
+ }
+
+ uri = mActivity.getContentResolver().insert(folder, contentValues);
+ Log.d(TAG, "saveUri: " + uri);
+ if( uri == null ) {
+ throw new IOException();
+ }
+ outputStream = mActivity.getContentResolver().openOutputStream(uri);
+ }
+ else {
+ file = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM) + File.separator + name);
+ outputStream = new FileOutputStream(file);
+ }
+
bitmap.compress(Bitmap.CompressFormat.JPEG, 90, outputStream);
outputStream.close();
- mActivity.getStorageUtils().broadcastFile(file, true, false, true);
+
+ if( MainActivity.useScopedStorage() ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ contentValues.clear();
+ contentValues.put(MediaStore.Images.Media.IS_PENDING, 0);
+ mActivity.getContentResolver().update(uri, contentValues, null, null);
+ }
+ }
+ else {
+ mActivity.getStorageUtils().broadcastFile(file, true, false, true);
+ }
}
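The scoped-storage save path above derives the MIME type from the filename extension via `StorageUtils.getImageMimeType(extension)`. A self-contained sketch of that kind of lookup (the mapping table is an assumption for illustration, not Open Camera's actual implementation); note that `name.substring(name.lastIndexOf("."))` in the code above assumes the name contains a dot, so the sketch guards against names without one:

```java
import java.util.Map;

public class MimeDemo {
    // illustrative extension-to-MIME table; Open Camera's real mapping may differ
    static final Map<String, String> MIME = Map.of(
            ".jpg", "image/jpeg",
            ".png", "image/png",
            ".webp", "image/webp",
            ".dng", "image/x-adobe-dng");

    static String imageMimeType(String filename) {
        int dot = filename.lastIndexOf('.');
        if (dot < 0)
            return "application/octet-stream"; // no extension: fall back to a generic type
        return MIME.getOrDefault(filename.substring(dot).toLowerCase(), "application/octet-stream");
    }

    public static void main(String[] args) {
        System.out.println(imageMimeType("IMG_20200507_020319_0.jpg")); // image/jpeg
        System.out.println(imageMimeType("noextension")); // application/octet-stream
    }
}
```

Supplying a correct `MIME_TYPE` in the `ContentValues` matters under scoped storage, since MediaStore uses it to decide which collection accepts the insert.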
/**
@@ -15782,7 +16682,6 @@ public class MainActivityTest extends ActivityInstrumentationTestCase2 inputs, String output_name, String gyro_debug_info_filename, float panorama_pics_per_screen, float camera_angle_x, float camera_angle_y, float gyro_tol_degrees) throws IOException, InterruptedException {
Log.d(TAG, "subTestPanorama");
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/MainTests.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/MainTests.java
index 863a0915dd533ce5cd5ff60bf455a77cdead2ff6..782bebb9f8b691bf5364d5f5efc18abb6a141d23 100644
--- a/app/src/androidTest/java/net/sourceforge/opencamera/test/MainTests.java
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/MainTests.java
@@ -16,6 +16,12 @@ public class MainTests {
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testSwitchVideo"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLocationSettings"));
// other tests:
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testScopedStorageChecks1"));
+ if( !MainActivityTest.test_camera2 ) {
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testScopedStorageChecks2"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testScopedStorageChecks3"));
+ }
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testScopedStorageChecks4"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testPause"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testImmediatelyQuit"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testStartCameraPreviewCount"));
@@ -31,7 +37,9 @@ public class MainTests {
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testPreviewSize"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testPreviewSizeWYSIWYG"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testResolutionMaxMP"));
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testResolutionBurst"));
+ if( MainActivityTest.test_camera2 ) {
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testResolutionBurst"));
+ }
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testAutoFocus"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testAutoFocusCorners"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testPopup"));
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/Nexus7Tests.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/Nexus7Tests.java
index 3494ec2ae3534fc35b6419f3e7c14d025a794c58..079a2ced56a476390f5773201230a701d69ab5a5 100644
--- a/app/src/androidTest/java/net/sourceforge/opencamera/test/Nexus7Tests.java
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/Nexus7Tests.java
@@ -1,5 +1,7 @@
package net.sourceforge.opencamera.test;
+import android.os.Build;
+
import junit.framework.Test;
import junit.framework.TestSuite;
@@ -9,13 +11,24 @@ public class Nexus7Tests {
TestSuite suite = new TestSuite(MainTests.class.getName());
// we run the following tests on the Nexus 7 as a device that supports SAF, but doesn't have Android 7+ (where we use alternative methods for read/writing Exif tags without needing File)
+ // update: we now (as of 1.48.2) use the same codepaths for exif tags for before and after Android 7, but might as well keep these tests here anyway
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoSAF"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testPhotoStampSAF"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testDirectionOnSAF"));
+ // tests useful for device with no flash, and only 1 camera
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testSwitchVideo"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testFocusFlashAvailability"));
+ // tests for testing Camera2 API with LEGACY Camera2 functionality
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhoto"));
+ if( MainActivityTest.isEmulator() && Build.VERSION.SDK_INT == Build.VERSION_CODES.M ) {
+ // video doesn't work on Android 6 emulator!
+ }
+ else {
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideo"));
+ }
+
return suite;
}
}
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/OldDeviceTests.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/OldDeviceTests.java
new file mode 100644
index 0000000000000000000000000000000000000000..8e6e1b1759699818496499258855bcca70913e5a
--- /dev/null
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/OldDeviceTests.java
@@ -0,0 +1,39 @@
+package net.sourceforge.opencamera.test;
+
+import junit.framework.Test;
+import junit.framework.TestSuite;
+
+public class OldDeviceTests {
+ // Small set of tests to run on very old devices.
+ public static Test suite() {
+ TestSuite suite = new TestSuite(MainTests.class.getName());
+
+        // put these tests first as they require various permissions to be granted, which can only be done by user action
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testSwitchVideo"));
+
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testPause"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testSaveModes"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testFocusFlashAvailability"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testGallery"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testSettings"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testSettingsSaveLoad"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testFolderChooserNew"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testFolderChooserInvalid"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testSaveFolderHistory"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testSettingsPrivacyPolicy"));
+
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLocationOn"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhoto"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoAutoLevel"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoAutoLevelLowMemory"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoAutoLevelAngles"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoAutoLevelAnglesLowMemory"));
+
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideo"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoSubtitles"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testIntentVideo"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testIntentVideoDurationLimit"));
+
+ return suite;
+ }
+}
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/PanoramaTests.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/PanoramaTests.java
index 5e3c38d00d1286f2ae4a160bf3a758b0a8e5950d..554ac0b79e667ea01cffce5c68b51caa06682913 100644
--- a/app/src/androidTest/java/net/sourceforge/opencamera/test/PanoramaTests.java
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/PanoramaTests.java
@@ -9,6 +9,7 @@ public class PanoramaTests {
* To use these tests, the testdata/ subfolder should be manually copied to the test device in the DCIM/testOpenCamera/
* folder (so you have DCIM/testOpenCamera/testdata/). We don't use assets/ as we'd end up with huge APK sizes which takes
* time to transfer to the device every time we run the tests.
+ * On Android 10+, scoped storage permission needs to be given to Open Camera for the DCIM/testOpenCamera/ folder.
*/
public static Test suite() {
TestSuite suite = new TestSuite(MainTests.class.getName());
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/PhotoTests.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/PhotoTests.java
index fb2a078780dfffdd84cd61e0a47f755f2da9117b..7f09028e0250a55cdc3ec33280d5feeb13026cec 100644
--- a/app/src/androidTest/java/net/sourceforge/opencamera/test/PhotoTests.java
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/PhotoTests.java
@@ -26,9 +26,6 @@ public class PhotoTests {
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhoto"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoContinuous"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoContinuousNoTouch"));
- if( !MainActivityTest.test_camera2 ) {
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoAutoStabilise"));
- }
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoFlashAuto"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoFlashOn"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoFlashTorch"));
@@ -90,9 +87,11 @@ public class PhotoTests {
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoHDRPhotoStamp"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoExpo"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoPanorama"));
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoPanoramaMax"));
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoPanoramaCancel"));
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoPanoramaCancelBySettings"));
+ if( !MainActivityTest.test_camera2 ) {
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoPanoramaMax"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoPanoramaCancel"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakePhotoPanoramaCancelBySettings"));
+ }
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testCreateSaveFolder1"));
if( !MainActivityTest.test_camera2 ) {
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testCreateSaveFolder2"));
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/TempTests.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/TempTests.java
new file mode 100644
index 0000000000000000000000000000000000000000..fc808e7c1ac6670be32bf685d5bdde98d1cf2a47
--- /dev/null
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/TempTests.java
@@ -0,0 +1,15 @@
+package net.sourceforge.opencamera.test;
+
+import junit.framework.Test;
+import junit.framework.TestSuite;
+
+public class TempTests {
+ // Dummy test suite for running an arbitrary subset of tests.
+ public static Test suite() {
+ TestSuite suite = new TestSuite(MainTests.class.getName());
+
+ //suite.addTest(TestSuite.createTest(MainActivityTest.class, "testZoom"));
+
+ return suite;
+ }
+}
diff --git a/app/src/androidTest/java/net/sourceforge/opencamera/test/VideoTests.java b/app/src/androidTest/java/net/sourceforge/opencamera/test/VideoTests.java
index aa31823d24781e32192fb9b33e104b09f9650e80..4787ac2baef6b382d8b46ae9d0785d0f68dd9362 100644
--- a/app/src/androidTest/java/net/sourceforge/opencamera/test/VideoTests.java
+++ b/app/src/androidTest/java/net/sourceforge/opencamera/test/VideoTests.java
@@ -8,13 +8,24 @@ public class VideoTests {
public static Test suite() {
TestSuite suite = new TestSuite(MainTests.class.getName());
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideo"));
+        // put these tests first as they require various permissions to be granted, which can only be done by user action:
if( !MainActivityTest.test_camera2 ) {
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoAudioControl"));
}
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoSAF"));
if( !MainActivityTest.test_camera2 ) {
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoSubtitles"));
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoSubtitlesGPS"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoSubtitlesSAF"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoSubtitlesGPSSAF"));
+ }
+ if( MainActivityTest.test_camera2 ) {
+ // tests for video log profile (but these don't actually record video)
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile1"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile2"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile3"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile1_extra_strong"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile2_extra_strong"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile3_extra_strong"));
}
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testIntentVideo"));
@@ -58,6 +69,9 @@ public class VideoTests {
}
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoTimeLapse"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoForceFailure"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoForceFailureSAF"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoForceIOException"));
+ suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoForceCameraControllerException"));
if( MainActivityTest.test_camera2 ) {
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testVideoLogProfile"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testVideoJTLogProfile"));
@@ -74,15 +88,6 @@ public class VideoTests {
/*suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideoBitrate"));
suite.addTest(TestSuite.createTest(MainActivityTest.class, "testTakeVideo4K"));*/
- // tests for video log profile (but these don't actually record video)
- if( MainActivityTest.test_camera2 ) {
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile1"));
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile2"));
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile3"));
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile1_extra_strong"));
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile2_extra_strong"));
- suite.addTest(TestSuite.createTest(MainActivityTest.class, "testLogProfile3_extra_strong"));
- }
return suite;
}
}
diff --git a/app/src/main/AndroidManifest.xml b/app/src/main/AndroidManifest.xml
index b5366f48887f8231e940cac4867c1212353b6892..1a9fb772372ca5dcfcf0d0bcfac761f28f6e64c9 100644
--- a/app/src/main/AndroidManifest.xml
+++ b/app/src/main/AndroidManifest.xml
@@ -2,21 +2,32 @@
-
-
-
+
+
+
+
+
+
@@ -34,11 +45,10 @@
@@ -75,11 +85,8 @@
-
-
-
-
@@ -87,17 +94,18 @@
android:name="net.sourceforge.opencamera.TakePhoto"
android:label="@string/take_photo"
android:icon="@mipmap/ic_launcher"
- android:screenOrientation="landscape"
android:configChanges="orientation|screenSize|keyboardHidden"
android:taskAffinity=""
android:excludeFromRecents="true"
+ android:exported="false"
>
+ android:name="net.sourceforge.opencamera.MyWidgetProvider"
+ android:exported="true">
@@ -112,7 +120,8 @@
+ android:name="net.sourceforge.opencamera.MyWidgetProviderTakePhoto"
+ android:exported="true">
@@ -125,7 +134,8 @@
android:name="net.sourceforge.opencamera.MyTileService"
android:icon="@drawable/ic_switch_camera"
android:label="@string/camera"
- android:permission="android.permission.BIND_QUICK_SETTINGS_TILE">
+ android:permission="android.permission.BIND_QUICK_SETTINGS_TILE"
+ android:exported="true">
@@ -135,7 +145,8 @@
android:name="net.sourceforge.opencamera.MyTileServiceVideo"
android:icon="@drawable/ic_switch_video"
android:label="@string/record_video"
- android:permission="android.permission.BIND_QUICK_SETTINGS_TILE">
+ android:permission="android.permission.BIND_QUICK_SETTINGS_TILE"
+ android:exported="true">
@@ -145,11 +156,21 @@
android:name="net.sourceforge.opencamera.MyTileServiceFrontCamera"
android:icon="@drawable/ic_face"
android:label="@string/selfie"
- android:permission="android.permission.BIND_QUICK_SETTINGS_TILE">
+ android:permission="android.permission.BIND_QUICK_SETTINGS_TILE"
+ android:exported="true">
-
+
+
+
+
+
+
+
diff --git a/app/src/main/assets/androidx_LICENSE-2.0.txt b/app/src/main/assets/androidx_LICENSE-2.0.txt
new file mode 100644
index 0000000000000000000000000000000000000000..d645695673349e3947e8e5ae42332d0ac3164cd7
--- /dev/null
+++ b/app/src/main/assets/androidx_LICENSE-2.0.txt
@@ -0,0 +1,202 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
diff --git a/app/src/main/java/net/sourceforge/opencamera/AudioListener.java b/app/src/main/java/net/sourceforge/opencamera/AudioListener.java
index 2d5bf4a2335e28634c2bf651b58e43829d926bde..dee516502cf1862edebcab34a2fe2efe9eae9360 100644
--- a/app/src/main/java/net/sourceforge/opencamera/AudioListener.java
+++ b/app/src/main/java/net/sourceforge/opencamera/AudioListener.java
@@ -5,6 +5,8 @@ import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.util.Log;
+import androidx.annotation.RequiresPermission;
+
/** Sets up a listener to listen for noise level.
*/
class AudioListener {
@@ -20,6 +22,7 @@ class AudioListener {
/** Create a new AudioListener. The caller should call the start() method to start listening.
*/
+ @RequiresPermission(android.Manifest.permission.RECORD_AUDIO)
AudioListener(final AudioListenerCallback cb) {
if( MyDebug.LOG )
Log.d(TAG, "new AudioListener");
diff --git a/app/src/main/java/net/sourceforge/opencamera/HDRProcessor.java b/app/src/main/java/net/sourceforge/opencamera/HDRProcessor.java
index f58882c5d9dfd13fff9a11e0614dd6fee994d584..7dec0c26096beaa991ca6af9d022526ad06b4e39 100644
--- a/app/src/main/java/net/sourceforge/opencamera/HDRProcessor.java
+++ b/app/src/main/java/net/sourceforge/opencamera/HDRProcessor.java
@@ -23,6 +23,9 @@ import android.renderscript.ScriptIntrinsicHistogram;
//import android.renderscript.ScriptIntrinsicResize;
import android.renderscript.Type;
import androidx.annotation.RequiresApi;
+
+import androidx.annotation.NonNull;
import android.util.Log;
public class HDRProcessor {
@@ -39,10 +42,10 @@ public class HDRProcessor {
private ScriptC_process_avg processAvgScript;
private ScriptC_create_mtb createMTBScript;
private ScriptC_align_mtb alignMTBScript;
- /*private ScriptC_histogram_adjust histogramAdjustScript;
- private ScriptC_histogram_compute histogramScript;
- private ScriptC_avg_brighten avgBrightenScript;
- private ScriptC_calculate_sharpness sharpnessScript;*/
+ /*private ScriptC_histogram_adjust histogramAdjustScript;
+ private ScriptC_histogram_compute histogramScript;
+ private ScriptC_avg_brighten avgBrightenScript;
+ private ScriptC_calculate_sharpness sharpnessScript;*/
// public for access by testing
public int [] offsets_x = null;
@@ -79,10 +82,10 @@ public class HDRProcessor {
processAvgScript = null;
createMTBScript = null;
alignMTBScript = null;
- /*histogramAdjustScript = null;
- histogramScript = null;
- avgBrightenScript = null;
- sharpnessScript = null;*/
+ /*histogramAdjustScript = null;
+ histogramScript = null;
+ avgBrightenScript = null;
+ sharpnessScript = null;*/
}
public void onDestroy() {
@@ -258,9 +261,9 @@ public class HDRProcessor {
Log.d(TAG, "parameter_B = " + parameter_B);
}
- if( MyDebug.LOG ) {
+ /*if( MyDebug.LOG ) {
// log samples to a CSV file
- File file = new File(Environment.getExternalStorageDirectory().getPath() + "/net.sourceforge.opencamera.hdr_samples_" + id + ".csv");
+ File file = new File(context.getExternalFilesDir(null).getPath() + "/net.sourceforge.opencamera.hdr_samples_" + id + ".csv");
if( file.exists() ) {
if( !file.delete() ) {
// keep FindBugs happy by checking return argument
@@ -296,7 +299,7 @@ public class HDRProcessor {
}
}
MediaScannerConnection.scanFile(context, new String[] { file.getAbsolutePath() }, null, null);
- }
+ }*/
}
}
@@ -332,7 +335,7 @@ public class HDRProcessor {
* this controls the level of the local contrast enhancement done in adjustHistogram().
* @param n_tiles A value of 1 or greater indicating how local the contrast enhancement algorithm should be.
* @param ce_preserve_blacks
- * If true (recommended), then we apply a modification to the contrast enhancement algorithm to avoid
+ * If true (recommended), then we apply a modification to the contrast enhancement algorithm to avoid
* making darker pixels too dark. A value of false gives more contrast on the darker regions of the
* resultant image.
* @param tonemapping_algorithm
@@ -417,8 +420,8 @@ public class HDRProcessor {
for(int x=0;x<n_w_samples;x++) {
if( x_coord + offset_x < 0 || x_coord + offset_x >= in_bitmap.getWidth() || y_coord + offset_y < 0 || y_coord + offset_y >= in_bitmap.getHeight() ) {
continue;
}
@@ -534,10 +537,10 @@ public class HDRProcessor {
ResponseFunction [] response_functions = new ResponseFunction[n_bitmaps]; // ResponseFunction for each image (the ResponseFunction entry can be left null to indicate the Identity)
offsets_x = new int[n_bitmaps];
offsets_y = new int[n_bitmaps];
- /*int [][] buffers = new int[n_bitmaps][];
- for(int i=0;i<n_bitmaps;i++) {
+
+ /** Returns whether the scene should be treated as "dark", based on the ISO and exposure time used.
+ */
+ private boolean sceneIsLowLight(int iso, long exposure_time) {
+ // Note: some devices can report iso >= 1600
+ // far more often, even for non-dark scenes. Potentially we could drop the requirement for
+ // "iso >= ISO_FOR_DARK" and instead have iso*exposure_time >= 91 to 115, but we need the
+ // dedicated iso check for Nexus 6 (iso 1196 exposure time 1/12s should be dark) and
+ // Nokia 8 testAvg23 (iso 1044 exposure time 0.1s shouldn't be dark).
+ // We also assume dark for long exposure times (which in practice is probably set in
+ // manual mode) - since long exposure times will give lower ISOs (e.g., on Galaxy S10e)
+ // (also useful for cameras where max ISO isn't as high as ISO_FOR_DARK)
+ //return iso >= ISO_FOR_DARK;
+ return ( iso >= ISO_FOR_DARK && iso*exposure_time >= 69*1000000000L ) || exposure_time >= (1000000000L/5-10000L);
+ }
+
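Pulled out of the diff context, the dark-scene test above can be sketched as a self-contained class. The class name is ours, and ISO_FOR_DARK = 1100 is an assumption: the patch references the constant but never shows its definition, and 1100 matches the hard-coded `iso >= 1100` threshold this patch replaces elsewhere.

```java
public class LowLightCheck {
    // Assumed value: the patch references ISO_FOR_DARK but does not show its definition;
    // 1100 matches the old hard-coded "iso >= 1100" threshold being replaced.
    static final int ISO_FOR_DARK = 1100;

    static boolean sceneIsLowLight(int iso, long exposureTimeNs) {
        // dark if the ISO is high AND the ISO*exposure product is large,
        // or if the exposure time alone is ~1/5s or longer
        return ( iso >= ISO_FOR_DARK && iso * exposureTimeNs >= 69 * 1000000000L )
                || exposureTimeNs >= (1000000000L / 5 - 10000L);
    }

    public static void main(String[] args) {
        // Nexus 6 night scene from the comment above: iso 1196, 1/12s -> dark
        System.out.println(sceneIsLowLight(1196, 1000000000L / 12));
        // bright daylight: low iso, short exposure -> not dark
        System.out.println(sceneIsLowLight(100, 10000000L));
    }
}
```

Note that the Nokia 8 testAvg23 case from the comment (iso 1044, 0.1s) correctly fails both branches under the assumed constant, since 1044 < ISO_FOR_DARK and 0.1s is below the long-exposure cutoff.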
private int cached_avg_sample_size = 1;
/** As part of the noise reduction process, the caller should scale the input images down by the factor returned
* by this method. This both provides a spatial smoothing, as well as improving performance and memory usage.
*/
- public int getAvgSampleSize(int capture_result_iso) {
+ public int getAvgSampleSize(int capture_result_iso, long capture_result_exposure_time) {
// If changing this, may also want to change the radius of the spatial filter in avg_brighten.rs ?
//this.cached_avg_sample_size = (n_images>=8) ? 2 : 1;
- this.cached_avg_sample_size = (capture_result_iso >= 1100) ? 2 : 1;
+ this.cached_avg_sample_size = sceneIsLowLight(capture_result_iso, capture_result_exposure_time) ? 2 : 1;
//this.cached_avg_sample_size = 1;
//this.cached_avg_sample_size = 2;
if( MyDebug.LOG )
@@ -1105,11 +1134,15 @@ public class HDRProcessor {
public Allocation allocation_out;
Bitmap bitmap_avg_align;
Allocation allocation_avg_align;
+ Bitmap bitmap_orig; // first bitmap, need to keep until all images are processed, due to being used for allocation_orig
+ Allocation allocation_orig; // saved version of the first allocation
- AvgData(Allocation allocation_out, Bitmap bitmap_avg_align, Allocation allocation_avg_align) {
+ AvgData(Allocation allocation_out, Bitmap bitmap_avg_align, Allocation allocation_avg_align, Bitmap bitmap_orig, Allocation allocation_orig) {
this.allocation_out = allocation_out;
this.bitmap_avg_align = bitmap_avg_align;
this.allocation_avg_align = allocation_avg_align;
+ this.bitmap_orig = bitmap_orig;
+ this.allocation_orig = allocation_orig;
}
public void destroy() {
@@ -1127,6 +1160,14 @@ public class HDRProcessor {
allocation_avg_align.destroy();
allocation_avg_align = null;
}
+ if( bitmap_orig != null ) {
+ bitmap_orig.recycle();
+ bitmap_orig = null;
+ }
+ if( allocation_orig != null ) {
+ allocation_orig.destroy();
+ allocation_orig = null;
+ }
}
}
@@ -1142,10 +1183,11 @@ public class HDRProcessor {
* @param bitmap_new The other input image. The bitmap is recycled.
* @param avg_factor The weighting factor for bitmap_avg.
* @param iso The ISO used to take the photos.
+ * @param exposure_time The exposure time used to take the photos.
* @param zoom_factor The digital zoom factor used to take the photos.
*/
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
- public AvgData processAvg(Bitmap bitmap_avg, Bitmap bitmap_new, float avg_factor, int iso, float zoom_factor) throws HDRProcessorException {
+ public AvgData processAvg(Bitmap bitmap_avg, Bitmap bitmap_new, float avg_factor, int iso, long exposure_time, float zoom_factor) throws HDRProcessorException {
if( MyDebug.LOG ) {
Log.d(TAG, "processAvg");
Log.d(TAG, "avg_factor: " + avg_factor);
@@ -1167,40 +1209,40 @@ public class HDRProcessor {
if( MyDebug.LOG )
Log.d(TAG, "### time after creating renderscript: " + (System.currentTimeMillis() - time_s));
// create allocations
- /*Allocation allocation_avg = Allocation.createFromBitmap(rs, bitmap_avg);
- //Allocation allocation_new = Allocation.createFromBitmap(rs, bitmap_new);
- //Allocation allocation_out = Allocation.createTyped(rs, Type.createXY(rs, Element.F32_3(rs), width, height));
- if( MyDebug.LOG )
- Log.d(TAG, "### time after creating allocations from bitmaps: " + (System.currentTimeMillis() - time_s));
- */
-
- /*final boolean use_sharpness_test = false; // disabled for now - takes about 1s extra, and no evidence this helps quality
- if( use_sharpness_test ) {
- float sharpness_avg = computeSharpness(allocation_avg, width, time_s);
- float sharpness_new = computeSharpness(allocation_new, width, time_s);
- if( sharpness_new > sharpness_avg ) {
- if( MyDebug.LOG )
- Log.d(TAG, "use new image as reference");
- Allocation dummy_allocation = allocation_avg;
- allocation_avg = allocation_new;
- allocation_new = dummy_allocation;
- Bitmap dummy_bitmap = bitmap_avg;
- bitmap_avg = bitmap_new;
- bitmap_new = dummy_bitmap;
- sharp_index = 1;
- }
- else {
- sharp_index = 0;
- }
- if( MyDebug.LOG )
- Log.d(TAG, "sharp_index: " + sharp_index);
- }*/
-
- /*LuminanceInfo luminanceInfo = computeMedianLuminance(bitmap_avg, 0, 0, width, height);
- if( MyDebug.LOG )
- Log.d(TAG, "median: " + luminanceInfo.median_value);*/
-
- AvgData avg_data = processAvgCore(null, null, bitmap_avg, bitmap_new, width, height, avg_factor, iso, zoom_factor, null, null, time_s);
+ /*Allocation allocation_avg = Allocation.createFromBitmap(rs, bitmap_avg);
+ //Allocation allocation_new = Allocation.createFromBitmap(rs, bitmap_new);
+ //Allocation allocation_out = Allocation.createTyped(rs, Type.createXY(rs, Element.F32_3(rs), width, height));
+ if( MyDebug.LOG )
+ Log.d(TAG, "### time after creating allocations from bitmaps: " + (System.currentTimeMillis() - time_s));
+ */
+
+ /*final boolean use_sharpness_test = false; // disabled for now - takes about 1s extra, and no evidence this helps quality
+ if( use_sharpness_test ) {
+ float sharpness_avg = computeSharpness(allocation_avg, width, time_s);
+ float sharpness_new = computeSharpness(allocation_new, width, time_s);
+ if( sharpness_new > sharpness_avg ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "use new image as reference");
+ Allocation dummy_allocation = allocation_avg;
+ allocation_avg = allocation_new;
+ allocation_new = dummy_allocation;
+ Bitmap dummy_bitmap = bitmap_avg;
+ bitmap_avg = bitmap_new;
+ bitmap_new = dummy_bitmap;
+ sharp_index = 1;
+ }
+ else {
+ sharp_index = 0;
+ }
+ if( MyDebug.LOG )
+ Log.d(TAG, "sharp_index: " + sharp_index);
+ }*/
+
+ /*LuminanceInfo luminanceInfo = computeMedianLuminance(bitmap_avg, 0, 0, width, height);
+ if( MyDebug.LOG )
+ Log.d(TAG, "median: " + luminanceInfo.median_value);*/
+
+ AvgData avg_data = processAvgCore(null, null, bitmap_avg, bitmap_new, width, height, avg_factor, iso, exposure_time, zoom_factor, null, null, null, time_s);
//allocation_avg.copyTo(bitmap_avg);
@@ -1217,10 +1259,11 @@ public class HDRProcessor {
* @param bitmap_new The new input image. The bitmap is recycled.
* @param avg_factor The weighting factor for bitmap_avg.
* @param iso The ISO used to take the photos.
+ * @param exposure_time The exposure time used to take the photos.
* @param zoom_factor The digital zoom factor used to take the photos.
*/
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
- public void updateAvg(AvgData avg_data, int width, int height, Bitmap bitmap_new, float avg_factor, int iso, float zoom_factor) throws HDRProcessorException {
+ public void updateAvg(AvgData avg_data, int width, int height, Bitmap bitmap_new, float avg_factor, int iso, long exposure_time, float zoom_factor) throws HDRProcessorException {
if( MyDebug.LOG ) {
Log.d(TAG, "updateAvg");
Log.d(TAG, "avg_factor: " + avg_factor);
@@ -1236,11 +1279,11 @@ public class HDRProcessor {
long time_s = System.currentTimeMillis();
// create allocations
- /*Allocation allocation_new = Allocation.createFromBitmap(rs, bitmap_new);
- if( MyDebug.LOG )
- Log.d(TAG, "### time after creating allocations from bitmaps: " + (System.currentTimeMillis() - time_s));*/
+ /*Allocation allocation_new = Allocation.createFromBitmap(rs, bitmap_new);
+ if( MyDebug.LOG )
+ Log.d(TAG, "### time after creating allocations from bitmaps: " + (System.currentTimeMillis() - time_s));*/
- processAvgCore(avg_data.allocation_out, avg_data.allocation_out, null, bitmap_new, width, height, avg_factor, iso, zoom_factor, avg_data.allocation_avg_align, avg_data.bitmap_avg_align, time_s);
+ processAvgCore(avg_data.allocation_out, avg_data.allocation_out, null, bitmap_new, width, height, avg_factor, iso, exposure_time, zoom_factor, avg_data.allocation_avg_align, avg_data.bitmap_avg_align, avg_data.allocation_orig, time_s);
if( MyDebug.LOG )
Log.d(TAG, "### time for updateAvg: " + (System.currentTimeMillis() - time_s));
@@ -1251,8 +1294,8 @@ public class HDRProcessor {
* new one will be created.
* @param allocation_avg If non-null, an allocation for the averaged image so far. If null, the
* first bitmap should be supplied as bitmap_avg.
- * @param bitmap_avg If non-null, the first bitmap (which will be recycled). If null, an
- * allocation_avg should be supplied.
+ * @param bitmap_avg If non-null, the first bitmap (which will be recycled when the returned
+ * AvgData is destroyed). If null, an allocation_avg should be supplied.
* @param bitmap_new The new bitmap to combined. The bitmap will be recycled.
* @param width The width of the bitmaps.
* @param height The height of the bitmaps.
@@ -1262,10 +1305,12 @@ public class HDRProcessor {
* @param allocation_avg_align If non-null, use this allocation for alignment for averaged image.
* @param bitmap_avg_align Should be supplied if allocation_avg_align is non-null, and stores
* the bitmap corresponding to the allocation_avg_align.
+ * @param allocation_orig
+ * If non-null, this is an allocation representing the first image.
* @param time_s Time, for debugging.
*/
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
- private AvgData processAvgCore(Allocation allocation_out, Allocation allocation_avg, Bitmap bitmap_avg, Bitmap bitmap_new, int width, int height, float avg_factor, int iso, float zoom_factor, Allocation allocation_avg_align, Bitmap bitmap_avg_align, long time_s) {
+ private AvgData processAvgCore(Allocation allocation_out, Allocation allocation_avg, Bitmap bitmap_avg, Bitmap bitmap_new, int width, int height, float avg_factor, int iso, long exposure_time, float zoom_factor, Allocation allocation_avg_align, Bitmap bitmap_avg_align, Allocation allocation_orig, long time_s) {
if( MyDebug.LOG ) {
Log.d(TAG, "processAvgCore");
Log.d(TAG, "iso: " + iso);
@@ -1299,7 +1344,7 @@ public class HDRProcessor {
//final int scale_align_size = Math.max(4 / this.cached_avg_sample_size, 1);
final int scale_align_size = (zoom_factor > 3.9f) ?
1 :
- Math.max(4 / this.getAvgSampleSize(iso), 1);
+ Math.max(4 / this.getAvgSampleSize(iso, exposure_time), 1);
if( MyDebug.LOG )
Log.d(TAG, "scale_align_size: " + scale_align_size);
boolean crop_to_centre = true;
@@ -1367,25 +1412,25 @@ public class HDRProcessor {
// misalignment more likely in "dark" images with more images and/or longer exposures
// using max_align_scale=2 needed to prevent misalignment in testAvg51; also helps testAvg14
- boolean wider = iso >= 1100;
+ boolean wider = sceneIsLowLight(iso, exposure_time);
autoAlignment(offsets_x, offsets_y, allocations, alignment_width, alignment_height, align_bitmaps, 0, true, null, false, floating_point_align, 1, crop_to_centre, wider ? 2 : 1, full_alignment_width, full_alignment_height, time_s);
- /*
- // compute allocation_diffs
- // if enabling this, should also:
- // - set full_align above to true
- // - set filter_align above to true
- if( processAvgScript == null ) {
- processAvgScript = new ScriptC_process_avg(rs);
- }
- processAvgScript.set_bitmap_align_new(allocations[1]);
- processAvgScript.set_offset_x_new(offsets_x[1]);
- processAvgScript.set_offset_y_new(offsets_y[1]);
- allocation_diffs = Allocation.createTyped(rs, Type.createXY(rs, Element.F32(rs), alignment_width, alignment_height));
- processAvgScript.forEach_compute_diff(allocations[0], allocation_diffs);
- processAvgScript.set_scale_align_size(scale_align_size);
- processAvgScript.set_allocation_diffs(allocation_diffs);
- */
+ /*
+ // compute allocation_diffs
+ // if enabling this, should also:
+ // - set full_align above to true
+ // - set filter_align above to true
+ if( processAvgScript == null ) {
+ processAvgScript = new ScriptC_process_avg(rs);
+ }
+ processAvgScript.set_bitmap_align_new(allocations[1]);
+ processAvgScript.set_offset_x_new(offsets_x[1]);
+ processAvgScript.set_offset_y_new(offsets_y[1]);
+ allocation_diffs = Allocation.createTyped(rs, Type.createXY(rs, Element.F32(rs), alignment_width, alignment_height));
+ processAvgScript.forEach_compute_diff(allocations[0], allocation_diffs);
+ processAvgScript.set_scale_align_size(scale_align_size);
+ processAvgScript.set_allocation_diffs(allocation_diffs);
+ */
if( scale_align ) {
for(int i=0;i<offsets_x.length;i++) {
- /*for(int i=1;i<bitmaps.size();i++) {
- List<Bitmap> bitmaps2 = new ArrayList<>();
- bitmaps2.add(bitmaps.get(0));
- bitmaps2.add(bitmap.get(i));
- Allocation [] allocations = new Allocation[2];
- allocations[0] = allocation_avg;
- allocations[1] = allocation_new;
- BrightnessDetails brightnessDetails = autoAlignment(offsets_x, offsets_y, allocations, width, height, bitmaps, 0, true, null, true, time_s);
- int median_brightness = brightnessDetails.median_brightness;
- if( MyDebug.LOG ) {
- Log.d(TAG, "### time after autoAlignment: " + (System.currentTimeMillis() - time_s));
- Log.d(TAG, "median_brightness: " + median_brightness);
- }
- }*/
+ /*for(int i=1;i<bitmaps.size();i++) {
+ List<Bitmap> bitmaps2 = new ArrayList<>();
+ bitmaps2.add(bitmaps.get(0));
+ bitmaps2.add(bitmap.get(i));
+ Allocation [] allocations = new Allocation[2];
+ allocations[0] = allocation_avg;
+ allocations[1] = allocation_new;
+ BrightnessDetails brightnessDetails = autoAlignment(offsets_x, offsets_y, allocations, width, height, bitmaps, 0, true, null, true, time_s);
+ int median_brightness = brightnessDetails.median_brightness;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "### time after autoAlignment: " + (System.currentTimeMillis() - time_s));
+ Log.d(TAG, "median_brightness: " + median_brightness);
+ }
+ }*/
// write new avg image
// create RenderScript
- /*if( processAvgScript == null ) {
- processAvgScript = new ScriptC_process_avg(rs);
- }*/
+ /*if( processAvgScript == null ) {
+ processAvgScript = new ScriptC_process_avg(rs);
+ }*/
ScriptC_process_avg processAvgScript = new ScriptC_process_avg(rs);
// set allocations
@@ -1807,10 +1871,18 @@ public class HDRProcessor {
BitmapInfo bitmapInfo = new BitmapInfo(luminanceInfos[i], bitmaps.get(i), allocations[i], i);
bitmapInfos.add(bitmapInfo);
}
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "before sorting:");
+ for(int i=0;i<bitmapInfos.size();i++) {
+ Log.d(TAG, i + ": " + bitmapInfos.get(i).luminanceInfo);
+ }
+ }
Collections.sort(bitmapInfos, new Comparator<BitmapInfo>() {
@Override
public int compare(BitmapInfo o1, BitmapInfo o2) {
- return o1.luminanceInfo.median_value - o2.luminanceInfo.median_value;
+ // important to use the code in LuminanceInfo.compareTo(), as that's also tested via the unit test
+ // sortLuminanceInfo()
+ return o1.luminanceInfo.compareTo(o2.luminanceInfo);
}
});
bitmaps.clear();
@@ -1820,8 +1892,9 @@ public class HDRProcessor {
allocations[i] = bitmapInfos.get(i).allocation;
}
if( MyDebug.LOG ) {
+ Log.d(TAG, "after sorting:");
for(int i=0;i<bitmapInfos.size();i++) {
+ /*if( median_value < min_diff_c+1 || median_value > 255-(min_diff_c+1) ) {
+ throw new RuntimeException("image " + i + " has median_value: " + median_value); // test
+ }*/
+ median_value = Math.max(median_value, min_diff_c+1);
+ median_value = Math.min(median_value, 255-(min_diff_c+1));
+ if( MyDebug.LOG )
+ Log.d(TAG, i + ": median_value is now: " + median_value);
+
// set parameters
if( use_mtb )
createMTBScript.set_median_value(median_value);
@@ -1894,31 +1985,31 @@ public class HDRProcessor {
if( MyDebug.LOG )
Log.d(TAG, "time after createMTBScript: " + (System.currentTimeMillis() - time_s));
- /*if( MyDebug.LOG ) {
- // debugging
- byte [] mtb_bytes = new byte[mtb_width*mtb_height];
- mtb_allocations[i].copyTo(mtb_bytes);
- int [] pixels = new int[mtb_width*mtb_height];
- for(int j=0;j<mtb_width*mtb_height;j++) {
- // ...
- }
- }*/
- static class LuminanceInfo {
+ static class LuminanceInfo implements Comparable<LuminanceInfo> {
+ final int min_value;
final int median_value;
+ final int hi_value;
final boolean noisy;
- LuminanceInfo(int median_value, boolean noisy) {
+ public LuminanceInfo(int min_value, int median_value, int hi_value, boolean noisy) {
+ this.min_value = min_value;
this.median_value = median_value;
+ this.hi_value = hi_value;
this.noisy = noisy;
}
+
+ @Override
+ @NonNull
+ public String toString() {
+ return "min: " + min_value + " , median: " + median_value + " , hi: " + hi_value + " , noisy: " + noisy;
+ }
+
+ @Override
+ public int compareTo(LuminanceInfo o) {
+ int value = this.median_value - o.median_value;
+ if( value == 0 ) {
+ // fall back to using min_value
+ value = this.min_value - o.min_value;
+ }
+ if( value == 0 ) {
+ // fall back to using hi_value
+ value = this.hi_value - o.hi_value;
+ }
+ return value;
+ }
}
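The compareTo() above orders images by median luminance, falling back to min_value and then hi_value only on ties. The three-level tie-break can be exercised in isolation; the sketch below mirrors the fields from the patch but is illustrative only (class and method names are not from the source):

```java
// Minimal sketch of the LuminanceInfo ordering from the patch:
// sort by median_value, falling back to min_value, then hi_value.
public class LuminanceOrderSketch {
    // a/b are {min_value, median_value, hi_value}
    static int compare(int[] a, int[] b) {
        int value = a[1] - b[1];          // primary key: median
        if( value == 0 )
            value = a[0] - b[0];          // tie-break 1: min
        if( value == 0 )
            value = a[2] - b[2];          // tie-break 2: hi
        return value;
    }

    public static void main(String[] args) {
        // different medians dominate
        if( compare(new int[]{0, 100, 200}, new int[]{50, 120, 150}) >= 0 )
            throw new AssertionError();
        // equal medians fall back to min
        if( compare(new int[]{10, 100, 200}, new int[]{20, 100, 150}) >= 0 )
            throw new AssertionError();
        // equal medians and mins fall back to hi
        if( compare(new int[]{10, 100, 150}, new int[]{10, 100, 200}) >= 0 )
            throw new AssertionError();
        System.out.println("ok");
    }
}
```

The subtraction form is safe here because all three fields are 0..255, so no overflow can occur.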
private LuminanceInfo computeMedianLuminance(Bitmap bitmap, int mtb_x, int mtb_y, int mtb_width, int mtb_height) {
@@ -2184,8 +2299,8 @@ public class HDRProcessor {
double beta = ((double) x + 1.0) / ((double) n_w_samples + 1.0);
//int x_coord = (int) (beta * bitmap.getWidth());
int x_coord = mtb_x + (int) (beta * mtb_width);
- /*if( MyDebug.LOG )
- Log.d(TAG, "sample value from " + x_coord + " , " + y_coord);*/
+ /*if( MyDebug.LOG )
+ Log.d(TAG, "sample value from " + x_coord + " , " + y_coord);*/
int color = bitmap.getPixel(x_coord, y_coord);
int r = (color & 0xFF0000) >> 16;
int g = (color & 0xFF00) >> 8;
@@ -2197,14 +2312,39 @@ public class HDRProcessor {
total++;
}
}
- /*float avg_luminance = (float)(Math.exp( sum_log_luminance / total ));
- if( MyDebug.LOG )
- Log.d(TAG, "avg_luminance: " + avg_luminance);*/
+ /*float avg_luminance = (float)(Math.exp( sum_log_luminance / total ));
+ if( MyDebug.LOG )
+ Log.d(TAG, "avg_luminance: " + avg_luminance);*/
int middle = total/2;
int count = 0;
boolean noisy = false;
+ int min_value = -1, hi_value = -1;
+ // first count backwards to get hi_value
+ for(int i=255;i>=0;i--) {
+ /*if( histo[i] > 0 ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "max luminance " + i);
+ max_value = i;
+ break;
+ }*/
+ count += histo[i];
+ if( count >= total/10 ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "hi luminance " + i);
+ hi_value = i;
+ break;
+ }
+ }
+
+ // then count forwards to get min and median values
+ count = 0;
for(int i=0;i<256;i++) {
count += histo[i];
+ if( min_value == -1 && histo[i] > 0 ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "min luminance " + i);
+ min_value = i;
+ }
if( count >= middle ) {
if( MyDebug.LOG )
Log.d(TAG, "median luminance " + i);
@@ -2232,11 +2372,11 @@ public class HDRProcessor {
Log.d(TAG, "too dark/noisy");
noisy = true;
}
- return new LuminanceInfo(i, noisy);
+ return new LuminanceInfo(min_value, i, hi_value, noisy);
}
}
Log.e(TAG, "computeMedianLuminance failed");
- return new LuminanceInfo(127, true);
+ return new LuminanceInfo(min_value, 127, hi_value, true);
}
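computeMedianLuminance() now walks the 256-bin histogram twice: backwards from 255 until a tenth of the samples are counted (hi_value), then forwards for the first non-empty bin (min_value) and the median. A standalone sketch of that two-pass scan, under the assumption that the histogram is already built (names are illustrative):

```java
public class HistoStatsSketch {
    // Returns {min, median, hi} for a 256-bin histogram, mirroring the patch:
    // hi is the brightest value with at least total/10 samples at or above it.
    static int[] stats(int[] histo) {
        int total = 0;
        for(int h : histo)
            total += h;
        int hi = -1, count = 0;
        // first count backwards to get hi_value
        for(int i=255;i>=0;i--) {
            count += histo[i];
            if( count >= total/10 ) {
                hi = i;
                break;
            }
        }
        // then count forwards to get min and median values
        int min = -1, median = -1;
        count = 0;
        for(int i=0;i<256;i++) {
            count += histo[i];
            if( min == -1 && histo[i] > 0 )
                min = i;
            if( median == -1 && count >= total/2 )
                median = i;
        }
        return new int[]{min, median, hi};
    }

    // hypothetical sample: 5 pixels at 10, 10 at 100, 5 at 200
    static int[] sampleHisto() {
        int [] histo = new int[256];
        histo[10] = 5; histo[100] = 10; histo[200] = 5;
        return histo;
    }

    public static void main(String[] args) {
        int [] s = stats(sampleHisto());
        // min=10; cumulative count reaches total/2=10 at bin 100; top decile (>=2) is reached at 200
        if( s[0] != 10 || s[1] != 100 || s[2] != 200 )
            throw new AssertionError(java.util.Arrays.toString(s));
        System.out.println("ok");
    }
}
```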
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
@@ -2256,50 +2396,50 @@ public class HDRProcessor {
Log.d(TAG, "time after creating histogram: " + (System.currentTimeMillis() - time_s));
histogramAllocation.copyTo(histogram);
- /*if( MyDebug.LOG ) {
- // compare/adjust
- allocations[0].copyTo(bm);
- int [] debug_histogram = new int[256];
- for(int i=0;i<256;i++) {
- debug_histogram[i] = 0;
- }
- int [] debug_buffer = new int[width];
- for(int y=0;y<height;y++) {
- bm.getPixels(debug_buffer, 0, width, 0, y, width, 1);
- for(int x=0;x<width;x++) {
- int color = debug_buffer[x];
- float r = (float)((color & 0xFF0000) >> 16);
- float g = (float)((color & 0xFF00) >> 8);
- float b = (float)(color & 0xFF);
- //float value = 0.299f*r + 0.587f*g + 0.114f*b; // matches ScriptIntrinsicHistogram default behaviour
- float value = Math.max(r, g);
- value = Math.max(value, b);
- int i_value = (int)value;
- i_value = Math.min(255, i_value); // just in case
- debug_histogram[i_value]++;
- }
- }
- for(int x=0;x<256;x++) {
- Log.d(TAG, "histogram[" + x + "] = " + histogram[x] + " debug_histogram: " + debug_histogram[x]);
- //histogram[x] = debug_histogram[x];
- }
- }*/
+ /*if( MyDebug.LOG ) {
+ // compare/adjust
+ allocations[0].copyTo(bm);
+ int [] debug_histogram = new int[256];
+ for(int i=0;i<256;i++) {
+ debug_histogram[i] = 0;
+ }
+ int [] debug_buffer = new int[width];
+ for(int y=0;y<height;y++) {
+ bm.getPixels(debug_buffer, 0, width, 0, y, width, 1);
+ for(int x=0;x<width;x++) {
+ int color = debug_buffer[x];
+ float r = (float)((color & 0xFF0000) >> 16);
+ float g = (float)((color & 0xFF00) >> 8);
+ float b = (float)(color & 0xFF);
+ //float value = 0.299f*r + 0.587f*g + 0.114f*b; // matches ScriptIntrinsicHistogram default behaviour
+ float value = Math.max(r, g);
+ value = Math.max(value, b);
+ int i_value = (int)value;
+ i_value = Math.min(255, i_value); // just in case
+ debug_histogram[i_value]++;
+ }
+ }
+ for(int x=0;x<256;x++) {
+ Log.d(TAG, "histogram[" + x + "] = " + histogram[x] + " debug_histogram: " + debug_histogram[x]);
+ //histogram[x] = debug_histogram[x];
+ }
+ }*/
int [] c_histogram = new int[256];
c_histogram[0] = histogram[0];
for(int x=1;x<256;x++) {
c_histogram[x] = c_histogram[x-1] + histogram[x];
}
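The prefix sum above builds the cumulative histogram that histogram_adjust.rs uses as an equalisation map, roughly value → 255·c_histogram[value]/n_pixels. A minimal sketch of that mapping, assuming full equalisation (the hdr_alpha blending in the script is omitted here):

```java
public class EqualizeSketch {
    // Classic histogram-equalisation lookup built from a cumulative histogram,
    // mirroring the prefix-sum loop in the patch.
    static int[] equalizeMap(int[] histogram) {
        int [] c = new int[256];
        c[0] = histogram[0];
        for(int x=1;x<256;x++)
            c[x] = c[x-1] + histogram[x];
        int total = c[255];
        int [] map = new int[256];
        for(int x=0;x<256;x++)
            map[x] = (255 * c[x]) / total;
        return map;
    }

    // hypothetical sample: two equal spikes at 0 and 128
    static int[] sampleHisto() {
        int [] histo = new int[256];
        histo[0] = 50; histo[128] = 50;
        return histo;
    }

    public static void main(String[] args) {
        int [] map = equalizeMap(sampleHisto());
        // half the pixels are at or below 0 -> ~127; all are at or below 128 -> 255
        if( map[0] != 127 || map[128] != 255 )
            throw new AssertionError(map[0] + " " + map[128]);
        System.out.println("ok");
    }
}
```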
- /*if( MyDebug.LOG ) {
- for(int x=0;x<256;x++) {
- Log.d(TAG, "histogram[" + x + "] = " + histogram[x] + " cumulative: " + c_histogram[x]);
- }
- }*/
+ /*if( MyDebug.LOG ) {
+ for(int x=0;x<256;x++) {
+ Log.d(TAG, "histogram[" + x + "] = " + histogram[x] + " cumulative: " + c_histogram[x]);
+ }
+ }*/
histogramAllocation.copyFrom(c_histogram);
- /*if( histogramAdjustScript == null ) {
- histogramAdjustScript = new ScriptC_histogram_adjust(rs);
- }*/
+ /*if( histogramAdjustScript == null ) {
+ histogramAdjustScript = new ScriptC_histogram_adjust(rs);
+ }*/
ScriptC_histogram_adjust histogramAdjustScript = new ScriptC_histogram_adjust(rs);
histogramAdjustScript.set_c_histogram(histogramAllocation);
histogramAdjustScript.set_hdr_alpha(hdr_alpha);
@@ -2325,11 +2465,11 @@ public class HDRProcessor {
// create histograms
Allocation histogramAllocation = Allocation.createSized(rs, Element.I32(rs), 256);
- /*if( histogramScript == null ) {
- if( MyDebug.LOG )
- Log.d(TAG, "create histogramScript");
- histogramScript = new ScriptC_histogram_compute(rs);
- }*/
+ /*if( histogramScript == null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "create histogramScript");
+ histogramScript = new ScriptC_histogram_compute(rs);
+ }*/
if( MyDebug.LOG )
Log.d(TAG, "create histogramScript");
ScriptC_histogram_compute histogramScript = new ScriptC_histogram_compute(rs);
@@ -2356,14 +2496,14 @@ public class HDRProcessor {
int stop_y = (int)(b1 * height);
if( stop_y == start_y )
continue;
- /*if( MyDebug.LOG )
- Log.d(TAG, i + " , " + j + " : " + start_x + " , " + start_y + " to " + stop_x + " , " + stop_y);*/
+ /*if( MyDebug.LOG )
+ Log.d(TAG, i + " , " + j + " : " + start_x + " , " + start_y + " to " + stop_x + " , " + stop_y);*/
Script.LaunchOptions launch_options = new Script.LaunchOptions();
launch_options.setX(start_x, stop_x);
launch_options.setY(start_y, stop_y);
- /*if( MyDebug.LOG )
- Log.d(TAG, "call histogramScript");*/
+ /*if( MyDebug.LOG )
+ Log.d(TAG, "call histogramScript");*/
histogramScript.invoke_init_histogram();
// We compute a histogram based on the max RGB value, so this matches with the scaling we do in histogram_adjust.rs.
// This improves the look of the grass in testHDR24, testHDR27.
@@ -2372,39 +2512,39 @@ public class HDRProcessor {
int [] histogram = new int[256];
histogramAllocation.copyTo(histogram);
- /*if( MyDebug.LOG ) {
- // compare/adjust
- allocations[0].copyTo(bm);
- int [] debug_histogram = new int[256];
- for(int k=0;k<256;k++) {
- debug_histogram[k] = 0;
- }
- int [] debug_buffer = new int[width];
- for(int y=start_y;y<stop_y;y++) {
- bm.getPixels(debug_buffer, 0, width, 0, y, width, 1);
- for(int x=start_x;x<stop_x;x++) {
- int color = debug_buffer[x];
- float r = (float)((color & 0xFF0000) >> 16);
- float g = (float)((color & 0xFF00) >> 8);
- float b = (float)(color & 0xFF);
- //float value = 0.299f*r + 0.587f*g + 0.114f*b; // matches ScriptIntrinsicHistogram default behaviour
- float value = Math.max(r, g);
- value = Math.max(value, b);
- int i_value = (int)value;
- i_value = Math.min(255, i_value); // just in case
- debug_histogram[i_value]++;
- }
- }
- for(int x=0;x<256;x++) {
- Log.d(TAG, "histogram[" + x + "] = " + histogram[x] + " debug_histogram: " + debug_histogram[x]);
- //histogram[x] = debug_histogram[x];
- }
- }*/
+ /*if( MyDebug.LOG ) {
+ // compare/adjust
+ allocations[0].copyTo(bm);
+ int [] debug_histogram = new int[256];
+ for(int k=0;k<256;k++) {
+ debug_histogram[k] = 0;
+ }
+ int [] debug_buffer = new int[width];
+ for(int y=start_y;y<stop_y;y++) {
+ bm.getPixels(debug_buffer, 0, width, 0, y, width, 1);
+ for(int x=start_x;x<stop_x;x++) {
+ int color = debug_buffer[x];
+ float r = (float)((color & 0xFF0000) >> 16);
+ float g = (float)((color & 0xFF00) >> 8);
+ float b = (float)(color & 0xFF);
+ //float value = 0.299f*r + 0.587f*g + 0.114f*b; // matches ScriptIntrinsicHistogram default behaviour
+ float value = Math.max(r, g);
+ value = Math.max(value, b);
+ int i_value = (int)value;
+ i_value = Math.min(255, i_value); // just in case
+ debug_histogram[i_value]++;
+ }
+ }
+ for(int x=0;x<256;x++) {
+ Log.d(TAG, "histogram[" + x + "] = " + histogram[x] + " debug_histogram: " + debug_histogram[x]);
+ //histogram[x] = debug_histogram[x];
+ }
+ }*/
// clip histogram, for Contrast Limited AHE algorithm
int n_pixels = (stop_x - start_x) * (stop_y - start_y);
int clip_limit = (5 * n_pixels) / 256;
- /*if( MyDebug.LOG ) {
+ /*if( MyDebug.LOG ) {
Log.d(TAG, "clip_limit: " + clip_limit);
Log.d(TAG, " relative clip limit: " + clip_limit*256.0f/n_pixels);
}*/
@@ -2441,10 +2581,10 @@ public class HDRProcessor {
}
}
int n_clipped_per_bucket = n_clipped / 256;
- /*if( MyDebug.LOG ) {
- Log.d(TAG, "n_clipped: " + n_clipped);
- Log.d(TAG, "n_clipped_per_bucket: " + n_clipped_per_bucket);
- }*/
+ /*if( MyDebug.LOG ) {
+ Log.d(TAG, "n_clipped: " + n_clipped);
+ Log.d(TAG, "n_clipped_per_bucket: " + n_clipped_per_bucket);
+ }*/
for(int x=0;x<256;x++) {
histogram[x] += n_clipped_per_bucket;
}
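The Contrast Limited AHE step above clips each bin at clip_limit and then spreads the clipped count evenly across all 256 bins, so the per-tile pixel count is preserved up to the integer remainder of n_clipped / 256. A hedged standalone sketch of that clip-and-redistribute step:

```java
public class ClaheClipSketch {
    // Clip histogram bins at 'limit' and redistribute the excess evenly,
    // as in the CLAHE step of the patch (the remainder of n_clipped/256
    // is dropped there too).
    static void clipAndRedistribute(int[] histogram, int limit) {
        int n_clipped = 0;
        for(int x=0;x<256;x++) {
            if( histogram[x] > limit ) {
                n_clipped += histogram[x] - limit;
                histogram[x] = limit;
            }
        }
        int n_clipped_per_bucket = n_clipped / 256;
        for(int x=0;x<256;x++)
            histogram[x] += n_clipped_per_bucket;
    }

    // hypothetical demo: one huge spike of 1024 pixels, clipped at 512
    static int[] demo() {
        int [] histo = new int[256];
        histo[5] = 1024;
        clipAndRedistribute(histo, 512);
        return histo;
    }

    public static void main(String[] args) {
        int [] histo = demo();
        // 512 clipped pixels -> 2 per bucket; the spike keeps limit + its share
        if( histo[5] != 514 || histo[0] != 2 )
            throw new AssertionError(histo[5] + " " + histo[0]);
        System.out.println("ok");
    }
}
```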
@@ -2494,8 +2634,8 @@ public class HDRProcessor {
if( MyDebug.LOG )
Log.d(TAG, "x: " + x + " ; limit: " + limit);
/*histogram[x] = Math.max(histogram[x], limit);
- if( MyDebug.LOG )
- Log.d(TAG, " histogram pulled up to: " + histogram[x]);*/
+ if( MyDebug.LOG )
+ Log.d(TAG, " histogram pulled up to: " + histogram[x]);*/
if( histogram[x] < limit ) {
// top up by redistributing later values
for(int y=x+1;y<256 && histogram[x] < limit;y++) {
@@ -2508,8 +2648,8 @@ public class HDRProcessor {
}
if( MyDebug.LOG )
Log.d(TAG, " histogram pulled up to: " + histogram[x]);
- /*if( temp_c_histogram[x] >= c_equal_limit )
- throw new RuntimeException(); // test*/
+ /*if( temp_c_histogram[x] >= c_equal_limit )
+ throw new RuntimeException(); // test*/
}
}
}
@@ -2533,9 +2673,9 @@ public class HDRProcessor {
Allocation c_histogramAllocation = Allocation.createSized(rs, Element.I32(rs), n_tiles*n_tiles*256);
c_histogramAllocation.copyFrom(c_histogram);
- /*if( histogramAdjustScript == null ) {
- histogramAdjustScript = new ScriptC_histogram_adjust(rs);
- }*/
+ /*if( histogramAdjustScript == null ) {
+ histogramAdjustScript = new ScriptC_histogram_adjust(rs);
+ }*/
ScriptC_histogram_adjust histogramAdjustScript = new ScriptC_histogram_adjust(rs);
histogramAdjustScript.set_c_histogram(c_histogramAllocation);
histogramAdjustScript.set_hdr_alpha(hdr_alpha);
@@ -2568,11 +2708,11 @@ public class HDRProcessor {
//final boolean use_custom_histogram = false;
final boolean use_custom_histogram = true;
if( use_custom_histogram ) {
- /*if( histogramScript == null ) {
- if( MyDebug.LOG )
- Log.d(TAG, "create histogramScript");
- histogramScript = new ScriptC_histogram_compute(rs);
- }*/
+ /*if( histogramScript == null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "create histogramScript");
+ histogramScript = new ScriptC_histogram_compute(rs);
+ }*/
if( MyDebug.LOG )
Log.d(TAG, "create histogramScript");
ScriptC_histogram_compute histogramScript = new ScriptC_histogram_compute(rs);
@@ -2681,6 +2821,26 @@ public class HDRProcessor {
}
private static int getBrightnessTarget(int brightness, float max_gain_factor, int ideal_brightness) {
+ if( brightness > 0 ) {
+ // At least try to achieve a minimum brightness.
+ // Increasing max_gain_factor helps the following tests significantly: testAvg12, testAvg14, testAvg15,
+ // testAvg28, testAvg31, testAvg32.
+ // Other tests also helped to a lesser degree are: testAvg1, testAvg5, testAvg6, testAvg40, testAvg41,
+ // testAvg42, testHDR1, testHDR1_exp5, testHDR11 (DRO example), testHDR20 (DRO example), testHDR28 (DRO example),
+ // testHDR48, testHDR49, testHDR49_exp5, testHDR53.
+ // We need to be careful of increasing max_gain_factor too high in some cases - for AvgTests, see comment in
+ // computeBrightenFactors() for examples of tests that would be affected.
+
+ final float min_brightness_c = 42.0f;
+ float min_max_gain_factor = min_brightness_c / brightness;
+ max_gain_factor = Math.max(max_gain_factor, min_max_gain_factor);
+
+ // still set some maximum max_gain_factor - highest max_gain_factor in tests is
+ // testAvg14 with max_gain_factor=14.0, which benefits from this, but some parts starting
+ // to look overblown
+ max_gain_factor = Math.min(max_gain_factor, 15.0f);
+ }
+
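The adjustment above works out to max_gain_factor = min(max(max_gain_factor, 42/brightness), 15). A quick numeric sketch of its effect (constants copied from the patch; the wrapper function name is illustrative):

```java
public class GainClampSketch {
    // Mirrors the max_gain_factor adjustment in getBrightnessTarget():
    // ensure at least min_brightness_c/brightness, but cap at 15.
    static float adjustGain(float max_gain_factor, int brightness) {
        final float min_brightness_c = 42.0f;
        if( brightness > 0 ) {
            float min_max_gain_factor = min_brightness_c / brightness;
            max_gain_factor = Math.max(max_gain_factor, min_max_gain_factor);
            max_gain_factor = Math.min(max_gain_factor, 15.0f);
        }
        return max_gain_factor;
    }

    public static void main(String[] args) {
        // dark image (brightness 4): 42/4 = 10.5 overrides the default 1.5
        if( Math.abs(adjustGain(1.5f, 4) - 10.5f) > 1e-6f )
            throw new AssertionError();
        // very dark image (brightness 2): 42/2 = 21 is capped at 15
        if( adjustGain(1.5f, 2) != 15.0f )
            throw new AssertionError();
        // bright image (brightness 100): 42/100 = 0.42, so the default 1.5 stands
        if( adjustGain(1.5f, 100) != 1.5f )
            throw new AssertionError();
        System.out.println("ok");
    }
}
```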
if( brightness <= 0 )
brightness = 1;
if( MyDebug.LOG ) {
@@ -2709,11 +2869,13 @@ public class HDRProcessor {
/** Computes various factors used in the avg_brighten.rs script.
*/
public static BrightenFactors computeBrightenFactors(boolean has_iso_exposure, int iso, long exposure_time, int brightness, int max_brightness) {
- // for outdoor/bright images, don't want max_gain_factor 4, otherwise we lose variation in grass colour in testAvg42
+ // For outdoor/bright images, don't want max_gain_factor 4, otherwise we lose variation in grass colour in testAvg42
// and having max_gain_factor at 1.5 prevents testAvg43, testAvg44 being too bright and oversaturated
// for other images, we also don't want max_gain_factor 4, as makes cases too bright and overblown if it would
// take the max_possible_value over 255. Especially testAvg46, but also testAvg25, testAvg31, testAvg38,
- // testAvg39
+ // testAvg39.
+ // Note however that we now do allow increasing the max_gain_factor in getBrightnessTarget(), depending on
+ // brightness levels.
float max_gain_factor = 1.5f;
int ideal_brightness = 119;
if( has_iso_exposure && iso < 1100 && exposure_time < 1000000000L/59 ) {
@@ -2724,6 +2886,9 @@ public class HDRProcessor {
int brightness_target = getBrightnessTarget(brightness, max_gain_factor, ideal_brightness);
//int max_target = Math.min(255, (int)((max_brightness*brightness_target)/(float)brightness + 0.5f) );
if( MyDebug.LOG ) {
+ Log.d(TAG, "brightness: " + brightness);
+ Log.d(TAG, "max_brightness: " + max_brightness);
+ Log.d(TAG, "ideal_brightness: " + ideal_brightness);
Log.d(TAG, "brightness target: " + brightness_target);
//Log.d(TAG, "max target: " + max_target);
}
@@ -2753,35 +2918,35 @@ public class HDRProcessor {
if( MyDebug.LOG )
Log.d(TAG, "max_possible_value: " + max_possible_value);
- /*if( max_possible_value > 255.0f ) {
- gain = 255.0f / max_brightness;
- if( MyDebug.LOG )
- Log.d(TAG, "limit gain to: " + gain);
- // use gamma correction for the remainder
- if( brightness_target > gain * brightness ) {
- gamma = (float) (Math.log(brightness_target / 255.0f) / Math.log(gain * brightness / 255.0f));
- }
- }
-
- //float gamma = (float)(Math.log(brightness_target/255.0f) / Math.log(brightness/255.0f));
- if( MyDebug.LOG )
- Log.d(TAG, "gamma " + gamma);
- final float min_gamma_non_bright_c = 0.75f;
- //final float min_gamma_non_bright_c = 0.5f;
- if( gamma > 1.0f ) {
- gamma = 1.0f;
- if( MyDebug.LOG ) {
- Log.d(TAG, "clamped gamma to : " + gamma);
- }
- }
- else if( has_iso_exposure && iso > 150 && gamma < min_gamma_non_bright_c ) {
- // too small gamma on non-bright reduces contrast too much (e.g., see testAvg9)
- // however we can't clamp too much, see testAvg28, testAvg32
- gamma = min_gamma_non_bright_c;
- if( MyDebug.LOG ) {
- Log.d(TAG, "clamped gamma to : " + gamma);
- }
- }*/
+ /*if( max_possible_value > 255.0f ) {
+ gain = 255.0f / max_brightness;
+ if( MyDebug.LOG )
+ Log.d(TAG, "limit gain to: " + gain);
+ // use gamma correction for the remainder
+ if( brightness_target > gain * brightness ) {
+ gamma = (float) (Math.log(brightness_target / 255.0f) / Math.log(gain * brightness / 255.0f));
+ }
+ }
+
+ //float gamma = (float)(Math.log(brightness_target/255.0f) / Math.log(brightness/255.0f));
+ if( MyDebug.LOG )
+ Log.d(TAG, "gamma " + gamma);
+ final float min_gamma_non_bright_c = 0.75f;
+ //final float min_gamma_non_bright_c = 0.5f;
+ if( gamma > 1.0f ) {
+ gamma = 1.0f;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "clamped gamma to : " + gamma);
+ }
+ }
+ else if( has_iso_exposure && iso > 150 && gamma < min_gamma_non_bright_c ) {
+ // too small gamma on non-bright reduces contrast too much (e.g., see testAvg9)
+ // however we can't clamp too much, see testAvg28, testAvg32
+ gamma = min_gamma_non_bright_c;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "clamped gamma to : " + gamma);
+ }
+ }*/
float mid_x = 255.5f;
if( max_possible_value > 255.0f ) {
@@ -2867,27 +3032,27 @@ public class HDRProcessor {
float gamma = brighten_factors.gamma;
//float gain = brightness_target / (float)brightness;
- /*float gamma = (float)(Math.log(max_target/(float)brightness_target) / Math.log(max_brightness/(float)brightness));
- float gain = brightness_target / ((float)Math.pow(brightness/255.0f, gamma) * 255.0f);
- if( MyDebug.LOG ) {
- Log.d(TAG, "gamma " + gamma);
- Log.d(TAG, "gain " + gain);
- Log.d(TAG, "gain2 " + max_target / ((float)Math.pow(max_brightness/255.0f, gamma) * 255.0f));
- }*/
- /*float gain = brightness_target / (float)brightness;
- if( MyDebug.LOG ) {
- Log.d(TAG, "gain: " + gain);
- }
- if( gain < 1.0f ) {
- gain = 1.0f;
- if( MyDebug.LOG ) {
- Log.d(TAG, "clamped gain to : " + gain);
- }
- }*/
-
- /*if( avgBrightenScript == null ) {
- avgBrightenScript = new ScriptC_avg_brighten(rs);
- }*/
+ /*float gamma = (float)(Math.log(max_target/(float)brightness_target) / Math.log(max_brightness/(float)brightness));
+ float gain = brightness_target / ((float)Math.pow(brightness/255.0f, gamma) * 255.0f);
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "gamma " + gamma);
+ Log.d(TAG, "gain " + gain);
+ Log.d(TAG, "gain2 " + max_target / ((float)Math.pow(max_brightness/255.0f, gamma) * 255.0f));
+ }*/
+ /*float gain = brightness_target / (float)brightness;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "gain: " + gain);
+ }
+ if( gain < 1.0f ) {
+ gain = 1.0f;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "clamped gain to : " + gain);
+ }
+ }*/
+
+ /*if( avgBrightenScript == null ) {
+ avgBrightenScript = new ScriptC_avg_brighten(rs);
+ }*/
ScriptC_avg_brighten avgBrightenScript = new ScriptC_avg_brighten(rs);
avgBrightenScript.set_bitmap(input);
float black_level = 0.0f;
@@ -2923,47 +3088,47 @@ public class HDRProcessor {
avgBrightenScript.set_median_filter_strength(median_filter_strength);
avgBrightenScript.invoke_setBrightenParameters(gain, gamma, low_x, mid_x, max_brightness);
- /*float tonemap_scale_c = 255.0f;
- if( MyDebug.LOG )
- Log.d(TAG, "tonemap_scale_c: " + tonemap_scale_c);
- avgBrightenScript.set_tonemap_scale(tonemap_scale_c);
-
- float max_possible_value = gain*max_brightness;
- if( MyDebug.LOG )
- Log.d(TAG, "max_possible_value: " + max_possible_value);
- if( max_possible_value < 255.0f ) {
- max_possible_value = 255.0f; // don't make dark images too bright
- if( MyDebug.LOG )
- Log.d(TAG, "clamp max_possible_value to: " + max_possible_value);
- }
- float linear_scale = (max_possible_value + tonemap_scale_c) / max_possible_value;
- if( MyDebug.LOG )
- Log.d(TAG, "linear_scale: " + linear_scale);
- avgBrightenScript.set_linear_scale(linear_scale);*/
-
- /*{
- max_possible_value = max_brightness;
- float tonemap_scale_c = 255.0f;
- if( 255.0f / max_possible_value < ((float)brightness_target)/(float)brightness + brightness_target / 255.0f - 1.0f ) {
- final float tonemap_denom = ((float)brightness_target)/(float)brightness - (255.0f / max_possible_value);
- if( MyDebug.LOG )
- Log.d(TAG, "tonemap_denom: " + tonemap_denom);
- if( tonemap_denom != 0.0f ) // just in case
- tonemap_scale_c = (255.0f - brightness_target) / tonemap_denom;
- //throw new RuntimeException(); // test
- }
- // Higher tonemap_scale_c values means darker results from the Reinhard tonemapping.
- // Colours brighter than 255-tonemap_scale_c will be made darker, colours darker than 255-tonemap_scale_c will be made brighter
- // (tonemap_scale_c==255 means therefore that colours will only be made darker).
- if( MyDebug.LOG )
- Log.d(TAG, "tonemap_scale_c: " + tonemap_scale_c);
- avgBrightenScript.set_tonemap_scale(tonemap_scale_c);
-
- float linear_scale = (max_possible_value + tonemap_scale_c) / max_possible_value;
- if( MyDebug.LOG )
- Log.d(TAG, "linear_scale: " + linear_scale);
- avgBrightenScript.set_linear_scale(linear_scale);
- }*/
+ /*float tonemap_scale_c = 255.0f;
+ if( MyDebug.LOG )
+ Log.d(TAG, "tonemap_scale_c: " + tonemap_scale_c);
+ avgBrightenScript.set_tonemap_scale(tonemap_scale_c);
+
+ float max_possible_value = gain*max_brightness;
+ if( MyDebug.LOG )
+ Log.d(TAG, "max_possible_value: " + max_possible_value);
+ if( max_possible_value < 255.0f ) {
+ max_possible_value = 255.0f; // don't make dark images too bright
+ if( MyDebug.LOG )
+ Log.d(TAG, "clamp max_possible_value to: " + max_possible_value);
+ }
+ float linear_scale = (max_possible_value + tonemap_scale_c) / max_possible_value;
+ if( MyDebug.LOG )
+ Log.d(TAG, "linear_scale: " + linear_scale);
+ avgBrightenScript.set_linear_scale(linear_scale);*/
+
+ /*{
+ max_possible_value = max_brightness;
+ float tonemap_scale_c = 255.0f;
+ if( 255.0f / max_possible_value < ((float)brightness_target)/(float)brightness + brightness_target / 255.0f - 1.0f ) {
+ final float tonemap_denom = ((float)brightness_target)/(float)brightness - (255.0f / max_possible_value);
+ if( MyDebug.LOG )
+ Log.d(TAG, "tonemap_denom: " + tonemap_denom);
+ if( tonemap_denom != 0.0f ) // just in case
+ tonemap_scale_c = (255.0f - brightness_target) / tonemap_denom;
+ //throw new RuntimeException(); // test
+ }
+ // Higher tonemap_scale_c values mean darker results from the Reinhard tonemapping.
+ // Colours brighter than 255-tonemap_scale_c will be made darker, colours darker than 255-tonemap_scale_c will be made brighter
+ // (tonemap_scale_c==255 means therefore that colours will only be made darker).
+ if( MyDebug.LOG )
+ Log.d(TAG, "tonemap_scale_c: " + tonemap_scale_c);
+ avgBrightenScript.set_tonemap_scale(tonemap_scale_c);
+
+ float linear_scale = (max_possible_value + tonemap_scale_c) / max_possible_value;
+ if( MyDebug.LOG )
+ Log.d(TAG, "linear_scale: " + linear_scale);
+ avgBrightenScript.set_linear_scale(linear_scale);
+ }*/
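The commented-out block above is built on a Reinhard-style curve value/(value + tonemap_scale_c), with linear_scale chosen as (max + c)/max so the brightest possible value still maps to full scale. A small sketch of that algebra (illustrative only, not the shipped avg_brighten.rs script):

```java
public class ReinhardScaleSketch {
    // Reinhard-style tonemap: out = 255 * linear_scale * x / (x + c),
    // with linear_scale = (max + c)/max so that out(max) == 255 exactly.
    static float tonemap(float x, float max, float c) {
        float linear_scale = (max + c) / max;
        return 255.0f * linear_scale * x / (x + c);
    }

    public static void main(String[] args) {
        float max = 510.0f, c = 255.0f;
        // the brightest value maps exactly to 255
        if( Math.abs(tonemap(max, max, c) - 255.0f) > 1e-3f )
            throw new AssertionError();
        // mid values land between the linear ramp 255*x/max and full scale,
        // i.e. darker values are lifted relative to a plain divide-by-max
        float mid = tonemap(255.0f, max, c);
        if( !(mid > 127.5f && mid < 255.0f) )
            throw new AssertionError(String.valueOf(mid));
        System.out.println("ok");
    }
}
```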
Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
Allocation allocation_out = Allocation.createFromBitmap(rs, bitmap);
@@ -3006,16 +3171,16 @@ public class HDRProcessor {
if( MyDebug.LOG )
Log.d(TAG, "### time after copying to bitmap: " + (System.currentTimeMillis() - time_s));
- /*int sample_size = getAvgSampleSize();
- if( MyDebug.LOG )
- Log.d(TAG, "sample_size: " + sample_size);
- if( sample_size > 1 ) {
- Matrix matrix = new Matrix();
- matrix.postScale(sample_size, sample_size);
- Bitmap new_bitmap = Bitmap.createBitmap(bitmap, 0, 0, width, height, matrix, true);
- bitmap.recycle();
- bitmap = new_bitmap;
- }*/
+ /*int sample_size = getAvgSampleSize();
+ if( MyDebug.LOG )
+ Log.d(TAG, "sample_size: " + sample_size);
+ if( sample_size > 1 ) {
+ Matrix matrix = new Matrix();
+ matrix.postScale(sample_size, sample_size);
+ Bitmap new_bitmap = Bitmap.createBitmap(bitmap, 0, 0, width, height, matrix, true);
+ bitmap.recycle();
+ bitmap = new_bitmap;
+ }*/
freeScripts();
if( MyDebug.LOG )
@@ -3029,7 +3194,6 @@ public class HDRProcessor {
* @param allocation_in The input allocation.
* @param width The width of the allocation.
*/
- @SuppressWarnings("unused")
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
private float computeSharpness(Allocation allocation_in, int width, long time_s) {
if( MyDebug.LOG )
@@ -3039,11 +3203,11 @@ public class HDRProcessor {
Allocation sumsAllocation = Allocation.createSized(rs, Element.I32(rs), width);
if( MyDebug.LOG )
Log.d(TAG, "### time after createSized: " + (System.currentTimeMillis() - time_s));
- /*if( sharpnessScript == null ) {
- sharpnessScript = new ScriptC_calculate_sharpness(rs);
- if( MyDebug.LOG )
- Log.d(TAG, "### time after create sharpnessScript: " + (System.currentTimeMillis() - time_s));
- }*/
+ /*if( sharpnessScript == null ) {
+ sharpnessScript = new ScriptC_calculate_sharpness(rs);
+ if( MyDebug.LOG )
+ Log.d(TAG, "### time after create sharpnessScript: " + (System.currentTimeMillis() - time_s));
+ }*/
ScriptC_calculate_sharpness sharpnessScript = new ScriptC_calculate_sharpness(rs);
if( MyDebug.LOG )
Log.d(TAG, "### time after create sharpnessScript: " + (System.currentTimeMillis() - time_s));
@@ -3066,8 +3230,8 @@ public class HDRProcessor {
sumsAllocation.destroy();
float total_sum = 0.0f;
for(int i=0;i<width;i++) {
total_sum += (float)sums[i];
}

List<byte[]> jpeg_images,
RawImage raw_image,
boolean image_capture_intent, Uri image_capture_intent_uri,
- boolean using_camera2,
+ boolean using_camera2, boolean using_camera_extensions,
ImageFormat image_format, int image_quality,
boolean do_auto_stabilise, double level_angle, List<float[]> gyro_rotation_matrix,
boolean is_front_facing,
@@ -191,7 +197,9 @@ public class ImageSaver extends Thread {
int iso,
long exposure_time,
float zoom_factor,
- String preference_stamp, String preference_textstamp, int font_size, int color, String pref_style, String preference_stamp_dateformat, String preference_stamp_timeformat, String preference_stamp_gpsformat, String preference_stamp_geo_address, String preference_units_distance,
+ String preference_stamp, String preference_textstamp, int font_size, int color, String pref_style, String preference_stamp_dateformat, String preference_stamp_timeformat, String preference_stamp_gpsformat,
+ //String preference_stamp_geo_address,
+ String preference_units_distance,
boolean panorama_crop,
boolean store_location, Location location, boolean store_geo_direction, double geo_direction,
double pitch_angle, boolean store_ypr,
@@ -208,6 +216,7 @@ public class ImageSaver extends Thread {
this.image_capture_intent = image_capture_intent;
this.image_capture_intent_uri = image_capture_intent_uri;
this.using_camera2 = using_camera2;
+ this.using_camera_extensions = using_camera_extensions;
this.image_format = image_format;
this.image_quality = image_quality;
this.do_auto_stabilise = do_auto_stabilise;
@@ -228,7 +237,7 @@ public class ImageSaver extends Thread {
this.preference_stamp_dateformat = preference_stamp_dateformat;
this.preference_stamp_timeformat = preference_stamp_timeformat;
this.preference_stamp_gpsformat = preference_stamp_gpsformat;
- this.preference_stamp_geo_address = preference_stamp_geo_address;
+ //this.preference_stamp_geo_address = preference_stamp_geo_address;
this.preference_units_distance = preference_units_distance;
this.panorama_crop = panorama_crop;
this.store_location = store_location;
@@ -254,7 +263,7 @@ public class ImageSaver extends Thread {
this.jpeg_images,
this.raw_image,
this.image_capture_intent, this.image_capture_intent_uri,
- this.using_camera2,
+ this.using_camera2, this.using_camera_extensions,
this.image_format, this.image_quality,
this.do_auto_stabilise, this.level_angle, this.gyro_rotation_matrix,
this.is_front_facing,
@@ -264,7 +273,9 @@ public class ImageSaver extends Thread {
this.iso,
this.exposure_time,
this.zoom_factor,
- this.preference_stamp, this.preference_textstamp, this.font_size, this.color, this.pref_style, this.preference_stamp_dateformat, this.preference_stamp_timeformat, this.preference_stamp_gpsformat, this.preference_stamp_geo_address, this.preference_units_distance,
+ this.preference_stamp, this.preference_textstamp, this.font_size, this.color, this.pref_style, this.preference_stamp_dateformat, this.preference_stamp_timeformat, this.preference_stamp_gpsformat,
+ //this.preference_stamp_geo_address,
+ this.preference_units_distance,
this.panorama_crop, this.store_location, this.location, this.store_geo_direction, this.geo_direction,
this.pitch_angle, this.store_ypr,
this.custom_tag_artist,
@@ -445,6 +456,22 @@ public class ImageSaver extends Thread {
return n_real_images_to_save;
}
+ /** Application has paused.
+ */
+ void onPause() {
+ synchronized(this) {
+ app_is_paused = true;
+ }
+ }
+
+ /** Application has resumed.
+ */
+ void onResume() {
+ synchronized(this) {
+ app_is_paused = false;
+ }
+ }
+
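onPause()/onResume() just toggle app_is_paused under the ImageSaver monitor lock, so the saving thread can consult it safely. A minimal sketch of the pattern (the reader method isPaused() is an illustrative addition, not part of the patch):

```java
public class PauseFlagSketch {
    private boolean app_is_paused = false;

    // writers take the monitor lock, as in the patch
    void onPause() {
        synchronized(this) {
            app_is_paused = true;
        }
    }

    void onResume() {
        synchronized(this) {
            app_is_paused = false;
        }
    }

    // readers take the same lock, so they always see the latest value
    // (synchronized blocks also give the needed happens-before ordering)
    boolean isPaused() {
        synchronized(this) {
            return app_is_paused;
        }
    }

    public static void main(String[] args) {
        PauseFlagSketch s = new PauseFlagSketch();
        if( s.isPaused() ) throw new AssertionError();
        s.onPause();
        if( !s.isPaused() ) throw new AssertionError();
        s.onResume();
        if( s.isPaused() ) throw new AssertionError();
        System.out.println("ok");
    }
}
```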
void onDestroy() {
if( MyDebug.LOG )
Log.d(TAG, "onDestroy");
@@ -493,6 +520,8 @@ public class ImageSaver extends Thread {
break;
}
if( test_slow_saving ) {
+ // ignore warning about "Call to Thread.sleep in a loop", this is only activated in test code
+ //noinspection BusyWait
Thread.sleep(2000);
}
if( MyDebug.LOG ) {
@@ -545,7 +574,7 @@ public class ImageSaver extends Thread {
boolean save_expo,
List<byte[]> images,
boolean image_capture_intent, Uri image_capture_intent_uri,
- boolean using_camera2,
+ boolean using_camera2, boolean using_camera_extensions,
Request.ImageFormat image_format, int image_quality,
boolean do_auto_stabilise, double level_angle,
boolean is_front_facing,
@@ -555,7 +584,9 @@ public class ImageSaver extends Thread {
int iso,
long exposure_time,
float zoom_factor,
- String preference_stamp, String preference_textstamp, int font_size, int color, String pref_style, String preference_stamp_dateformat, String preference_stamp_timeformat, String preference_stamp_gpsformat, String preference_stamp_geo_address, String preference_units_distance,
+ String preference_stamp, String preference_textstamp, int font_size, int color, String pref_style, String preference_stamp_dateformat, String preference_stamp_timeformat, String preference_stamp_gpsformat,
+ //String preference_stamp_geo_address,
+ String preference_units_distance,
boolean panorama_crop,
boolean store_location, Location location, boolean store_geo_direction, double geo_direction,
double pitch_angle, boolean store_ypr,
@@ -576,7 +607,7 @@ public class ImageSaver extends Thread {
images,
null,
image_capture_intent, image_capture_intent_uri,
- using_camera2,
+ using_camera2, using_camera_extensions,
image_format, image_quality,
do_auto_stabilise, level_angle,
is_front_facing,
@@ -586,7 +617,9 @@ public class ImageSaver extends Thread {
iso,
exposure_time,
zoom_factor,
- preference_stamp, preference_textstamp, font_size, color, pref_style, preference_stamp_dateformat, preference_stamp_timeformat, preference_stamp_gpsformat, preference_stamp_geo_address, preference_units_distance,
+ preference_stamp, preference_textstamp, font_size, color, pref_style, preference_stamp_dateformat, preference_stamp_timeformat, preference_stamp_gpsformat,
+ //preference_stamp_geo_address,
+ preference_units_distance,
panorama_crop, store_location, location, store_geo_direction, geo_direction,
pitch_angle, store_ypr,
custom_tag_artist,
@@ -618,7 +651,7 @@ public class ImageSaver extends Thread {
null,
raw_image,
false, null,
- false,
+ false, false,
Request.ImageFormat.STD, 0,
false, 0.0,
false,
@@ -628,7 +661,9 @@ public class ImageSaver extends Thread {
0,
0,
1.0f,
- null, null, 0, 0, null, null, null, null, null, null,
+ null, null, 0, 0, null, null, null, null,
+ //null,
+ null,
false, false, null, false, 0.0,
0.0, false,
null, null,
@@ -644,7 +679,7 @@ public class ImageSaver extends Thread {
Request.ProcessType processType,
Request.SaveBase save_base,
boolean image_capture_intent, Uri image_capture_intent_uri,
- boolean using_camera2,
+ boolean using_camera2, boolean using_camera_extensions,
Request.ImageFormat image_format, int image_quality,
boolean do_auto_stabilise, double level_angle, boolean want_gyro_matrices,
boolean is_front_facing,
@@ -653,7 +688,9 @@ public class ImageSaver extends Thread {
int iso,
long exposure_time,
float zoom_factor,
- String preference_stamp, String preference_textstamp, int font_size, int color, String pref_style, String preference_stamp_dateformat, String preference_stamp_timeformat, String preference_stamp_gpsformat, String preference_stamp_geo_address, String preference_units_distance,
+ String preference_stamp, String preference_textstamp, int font_size, int color, String pref_style, String preference_stamp_dateformat, String preference_stamp_timeformat, String preference_stamp_gpsformat,
+ //String preference_stamp_geo_address,
+ String preference_units_distance,
boolean panorama_crop,
boolean store_location, Location location, boolean store_geo_direction, double geo_direction,
double pitch_angle, boolean store_ypr,
@@ -669,12 +706,12 @@ public class ImageSaver extends Thread {
false,
0,
save_base,
- new ArrayList(),
+ new ArrayList<>(),
null,
image_capture_intent, image_capture_intent_uri,
- using_camera2,
+ using_camera2, using_camera_extensions,
image_format, image_quality,
- do_auto_stabilise, level_angle, want_gyro_matrices ? new ArrayList() : null,
+ do_auto_stabilise, level_angle, want_gyro_matrices ? new ArrayList<>() : null,
is_front_facing,
mirror,
current_date,
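The `new ArrayList()` to `new ArrayList<>()` changes in this hunk swap raw types for the diamond operator, so the element type is inferred from the declaration rather than erased to a raw `List`. A minimal, self-contained illustration (not Open Camera code):

```java
import java.util.ArrayList;
import java.util.List;

public class DiamondDemo {
    /** new ArrayList() produces a raw type: it compiles, but with unchecked
     *  warnings, and the compiler can no longer check element types. The
     *  diamond form infers the type argument from the target type. */
    static List<String> makeTypedList() {
        return new ArrayList<>(); // inferred as ArrayList<String>
    }

    public static void main(String[] args) {
        List<String> filenames = makeTypedList();
        filenames.add("IMG_0001.jpg");
        System.out.println(filenames.size());
    }
}
```

Raw `ArrayList` still works at runtime, but loses compile-time element-type checking, which is why this lint-driven cleanup is worthwhile.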
@@ -682,7 +719,9 @@ public class ImageSaver extends Thread {
iso,
exposure_time,
zoom_factor,
- preference_stamp, preference_textstamp, font_size, color, pref_style, preference_stamp_dateformat, preference_stamp_timeformat, preference_stamp_gpsformat, preference_stamp_geo_address, preference_units_distance,
+ preference_stamp, preference_textstamp, font_size, color, pref_style, preference_stamp_dateformat, preference_stamp_timeformat, preference_stamp_gpsformat,
+ //preference_stamp_geo_address,
+ preference_units_distance,
panorama_crop, store_location, location, store_geo_direction, geo_direction,
pitch_angle, store_ypr,
custom_tag_artist,
@@ -751,7 +790,7 @@ public class ImageSaver extends Thread {
List jpeg_images,
RawImage raw_image,
boolean image_capture_intent, Uri image_capture_intent_uri,
- boolean using_camera2,
+ boolean using_camera2, boolean using_camera_extensions,
Request.ImageFormat image_format, int image_quality,
boolean do_auto_stabilise, double level_angle,
boolean is_front_facing,
@@ -761,7 +800,9 @@ public class ImageSaver extends Thread {
int iso,
long exposure_time,
float zoom_factor,
- String preference_stamp, String preference_textstamp, int font_size, int color, String pref_style, String preference_stamp_dateformat, String preference_stamp_timeformat, String preference_stamp_gpsformat, String preference_stamp_geo_address, String preference_units_distance,
+ String preference_stamp, String preference_textstamp, int font_size, int color, String pref_style, String preference_stamp_dateformat, String preference_stamp_timeformat, String preference_stamp_gpsformat,
+ //String preference_stamp_geo_address,
+ String preference_units_distance,
boolean panorama_crop,
boolean store_location, Location location, boolean store_geo_direction, double geo_direction,
double pitch_angle, boolean store_ypr,
@@ -784,7 +825,7 @@ public class ImageSaver extends Thread {
jpeg_images,
raw_image,
image_capture_intent, image_capture_intent_uri,
- using_camera2,
+ using_camera2, using_camera_extensions,
image_format, image_quality,
do_auto_stabilise, level_angle, null,
is_front_facing,
@@ -794,7 +835,9 @@ public class ImageSaver extends Thread {
iso,
exposure_time,
zoom_factor,
- preference_stamp, preference_textstamp, font_size, color, pref_style, preference_stamp_dateformat, preference_stamp_timeformat, preference_stamp_gpsformat, preference_stamp_geo_address, preference_units_distance,
+ preference_stamp, preference_textstamp, font_size, color, pref_style, preference_stamp_dateformat, preference_stamp_timeformat, preference_stamp_gpsformat,
+ //preference_stamp_geo_address,
+ preference_units_distance,
panorama_crop, store_location, location, store_geo_direction, geo_direction,
pitch_angle, store_ypr,
custom_tag_artist,
@@ -895,7 +938,7 @@ public class ImageSaver extends Thread {
null,
null,
false, null,
- false,
+ false, false,
Request.ImageFormat.STD, 0,
false, 0.0, null,
false,
@@ -905,7 +948,9 @@ public class ImageSaver extends Thread {
0,
0,
1.0f,
- null, null, 0, 0, null, null, null, null, null, null,
+ null, null, 0, 0, null, null, null, null,
+ //null,
+ null,
false, false, null, false, 0.0,
0.0, false,
null, null,
@@ -1095,7 +1140,7 @@ public class ImageSaver extends Thread {
default:
// Using local contrast enhancement helps scenes where the dynamic range is very large, which tends to be when we choose
// a short exposure time, due to fixing problems where some regions are too dark.
- // This helps: testHDR11, testHDR19, testHDR34, testHDR53.
+ // This helps: testHDR11, testHDR19, testHDR34, testHDR53, testHDR61.
// Using local contrast enhancement in all cases can increase noise in darker scenes. The tests where this problem
// would occur (if we used local contrast enhancement) are: testHDR2, testHDR12, testHDR17, testHDR43, testHDR50, testHDR51,
// testHDR54, testHDR55, testHDR56.
@@ -1179,7 +1224,6 @@ public class ImageSaver extends Thread {
@SuppressWarnings("WeakerAccess")
public static class GyroDebugInfo {
- @SuppressWarnings("unused")
public static class GyroImageDebugInfo {
public float [] vectorRight; // X axis
public float [] vectorUp; // Y axis
@@ -1341,7 +1385,7 @@ public class ImageSaver extends Thread {
long time_s = System.currentTimeMillis();
// initialise allocation from first two bitmaps
//int inSampleSize = hdrProcessor.getAvgSampleSize(request.jpeg_images.size());
- int inSampleSize = hdrProcessor.getAvgSampleSize(request.iso);
+ int inSampleSize = hdrProcessor.getAvgSampleSize(request.iso, request.exposure_time);
//final boolean use_smp = false;
final boolean use_smp = true;
// n_smp_images is how many bitmaps to decompress at once if use_smp==true. Beware of setting too high -
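The `getAvgSampleSize()` change above passes `request.exposure_time` alongside `request.iso`, so the decompression sample size used for NR averaging can take exposure into account as well as ISO. The sketch below shows the general shape of such a heuristic; the thresholds, return values, and method body are illustrative assumptions, not the actual `HDRProcessor` implementation:

```java
public class SampleSizeHeuristic {
    /** Hypothetical: downscale more aggressively (larger inSampleSize) for
     *  bright scenes, where low ISO and a short exposure suggest plenty of
     *  signal; keep full resolution when the scene is dark and noisy.
     *  Thresholds and return values are invented for illustration. */
    static int getAvgSampleSize(int iso, long exposureTimeNs) {
        final long NS_PER_SECOND = 1000000000L;
        boolean darkScene = iso >= 1100 || exposureTimeNs >= NS_PER_SECOND / 60;
        return darkScene ? 1 : 2;
    }

    public static void main(String[] args) {
        System.out.println(getAvgSampleSize(100, 1000000L));    // bright scene
        System.out.println(getAvgSampleSize(3200, 100000000L)); // dark scene
    }
}
```

The design point of the diff is simply that ISO alone is an incomplete brightness proxy; a long exposure at low ISO can still indicate a dark scene.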
@@ -1386,7 +1430,7 @@ public class ImageSaver extends Thread {
int height = bitmap0.getHeight();
float avg_factor = 1.0f;
this_time_s = System.currentTimeMillis();
- HDRProcessor.AvgData avg_data = hdrProcessor.processAvg(bitmap0, bitmap1, avg_factor, request.iso, request.zoom_factor);
+ HDRProcessor.AvgData avg_data = hdrProcessor.processAvg(bitmap0, bitmap1, avg_factor, request.iso, request.exposure_time, request.zoom_factor);
if( bitmaps != null ) {
bitmaps.set(0, null);
bitmaps.set(1, null);
@@ -1438,7 +1482,7 @@ public class ImageSaver extends Thread {
}
avg_factor = (float)i;
this_time_s = System.currentTimeMillis();
- hdrProcessor.updateAvg(avg_data, width, height, new_bitmap, avg_factor, request.iso, request.zoom_factor);
+ hdrProcessor.updateAvg(avg_data, width, height, new_bitmap, avg_factor, request.iso, request.exposure_time, request.zoom_factor);
// updateAvg recycles new_bitmap
if( bitmaps != null ) {
bitmaps.set(i, null);
@@ -1617,7 +1661,7 @@ public class ImageSaver extends Thread {
writeGyroDebugXml(writer, request);
StorageUtils storageUtils = main_activity.getStorageUtils();
- File saveFile = null;
+ /*File saveFile = null;
Uri saveUri = null;
if( storageUtils.isUsingSAF() ) {
saveUri = storageUtils.createOutputMediaFileSAF(StorageUtils.MEDIA_TYPE_GYRO_INFO, "", "xml", request.current_date);
@@ -1626,7 +1670,16 @@ public class ImageSaver extends Thread {
saveFile = storageUtils.createOutputMediaFile(StorageUtils.MEDIA_TYPE_GYRO_INFO, "", "xml", request.current_date);
if( MyDebug.LOG )
Log.d(TAG, "save to: " + saveFile.getAbsolutePath());
- }
+ }*/
+ // We save to the application-specific folder so this works on Android 10 with scoped storage, without having to
+ // rewrite the non-SAF codepath to use the MediaStore API (which would also have the problem that the gyro debug files
+ // would show up in the MediaStore, and hence in gallery applications!)
+ // We use this for older Android versions too, for consistency; plus it's not a bad idea to have debug files in the
+ // application folder anyway.
+ File saveFile = storageUtils.createOutputMediaFile(main_activity.getExternalFilesDir(null), StorageUtils.MEDIA_TYPE_GYRO_INFO, "", "xml", request.current_date);
+ Uri saveUri = null;
+ if( MyDebug.LOG )
+ Log.d(TAG, "save to: " + saveFile.getAbsolutePath());
OutputStream outputStream;
if( saveFile != null )
@@ -1646,7 +1699,7 @@ public class ImageSaver extends Thread {
storageUtils.broadcastFile(saveFile, false, false, false);
}
else {
- broadcastSAFFile(saveUri, false);
+ broadcastSAFFile(saveUri, false, false);
}
}
catch(IOException e) {
@@ -1685,10 +1738,9 @@ public class ImageSaver extends Thread {
}
// rotate the bitmaps if necessary for exif tags
- File exifTempFile = getExifTempFile(request.jpeg_images.get(0));
for(int i=0;i<request.jpeg_images.size();i++) {
@@ ... @@ public class ImageSaver extends Thread {
boolean dategeo_stamp = request.preference_stamp.equals("preference_stamp_yes");
boolean text_stamp = request.preference_textstamp.length() > 0;
if( dategeo_stamp || text_stamp ) {
if( bitmap == null ) {
if( MyDebug.LOG )
Log.d(TAG, "decode bitmap in order to stamp info");
- bitmap = loadBitmapWithRotation(data, true, exifTempFile);
+ bitmap = loadBitmapWithRotation(data, true);
if( bitmap == null ) {
main_activity.getPreview().showToast(null, R.string.failed_to_stamp);
System.gc();
@@ -2068,6 +2117,12 @@ public class ImageSaver extends Thread {
Log.d(TAG, "stamp info to bitmap: " + bitmap);
if( MyDebug.LOG )
Log.d(TAG, "bitmap is mutable?: " + bitmap.isMutable());
+
+ String stamp_string = "";
+ /* We now stamp via a TextView instead of using MyApplicationInterface.drawTextWithBackground().
+ * This is important in order to satisfy the Google emoji policy...
+ */
+
int font_size = request.font_size;
int color = request.color;
String pref_style = request.pref_style;
@@ -2133,18 +2188,32 @@ public class ImageSaver extends Thread {
datetime_stamp += " ";
datetime_stamp += time_stamp;
}
- applicationInterface.drawTextWithBackground(canvas, p, datetime_stamp, color, Color.BLACK, width - offset_x, ypos, MyApplicationInterface.Alignment.ALIGNMENT_BOTTOM, null, draw_shadowed);
+ //applicationInterface.drawTextWithBackground(canvas, p, datetime_stamp, color, Color.BLACK, width - offset_x, ypos, MyApplicationInterface.Alignment.ALIGNMENT_BOTTOM, null, draw_shadowed);
+ if( stamp_string.length() == 0 )
+ stamp_string = datetime_stamp;
+ else
+ stamp_string = datetime_stamp + "\n" + stamp_string;
}
ypos -= diff_y;
String gps_stamp = main_activity.getTextFormatter().getGPSString(preference_stamp_gpsformat, request.preference_units_distance, request.store_location, request.location, request.store_geo_direction, request.geo_direction);
if( gps_stamp.length() > 0 ) {
// don't log gps_stamp, in case of privacy!
- Address address = null;
+ /*Address address = null;
if( request.store_location && !request.preference_stamp_geo_address.equals("preference_stamp_geo_address_no") ) {
+ boolean block_geocoder;
+ synchronized(this) {
+ block_geocoder = app_is_paused;
+ }
// try to find an address
// n.b., if we update the class being used, consider whether the info on Geocoder in preference_stamp_geo_address_summary needs updating
- if( Geocoder.isPresent() ) {
+ if( block_geocoder ) {
+ // seems safer to not try to initiate potential network connections (via geocoder) if Open Camera
+ // has paused and we're still saving images
+ if( MyDebug.LOG )
+ Log.d(TAG, "don't call geocoder for photostamp as app is paused");
+ }
+ else if( Geocoder.isPresent() ) {
if( MyDebug.LOG )
Log.d(TAG, "geocoder is present");
Geocoder geocoder = new Geocoder(main_activity, Locale.getDefault());
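Inside the (now commented-out) geocoder block, the new `block_geocoder` flag is read under `synchronized` so the saver thread takes a consistent snapshot of `app_is_paused` before deciding whether to skip the network-backed lookup. The same guard pattern, reduced to a runnable plain-Java sketch (class and method names are illustrative):

```java
public class PausedGuard {
    private boolean appIsPaused; // set from the UI thread, read from the saver thread

    synchronized void setPaused(boolean paused) {
        this.appIsPaused = paused;
    }

    /** Snapshot the flag under the same lock that guards writes, then act on
     *  the local copy, so the decision stays consistent even if the flag is
     *  flipped concurrently. */
    boolean shouldBlockGeocoder() {
        boolean block;
        synchronized(this) {
            block = appIsPaused;
        }
        return block;
    }

    static boolean demo() {
        PausedGuard guard = new PausedGuard();
        if( guard.shouldBlockGeocoder() )
            return false; // not paused yet, geocoder should be allowed
        guard.setPaused(true);
        return guard.shouldBlockGeocoder(); // paused now, geocoder is blocked
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```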
@@ -2167,45 +2236,96 @@ public class ImageSaver extends Thread {
if( MyDebug.LOG )
Log.d(TAG, "geocoder not present");
}
- }
+ }*/
- if( address == null || request.preference_stamp_geo_address.equals("preference_stamp_geo_address_both") ) {
+ //if( address == null || request.preference_stamp_geo_address.equals("preference_stamp_geo_address_both") )
+ {
if( MyDebug.LOG )
Log.d(TAG, "display gps coords");
// want GPS coords (either in addition to the address, or we don't have an address)
// we'll also enter here if store_location is false, but we have geo direction to display
- applicationInterface.drawTextWithBackground(canvas, p, gps_stamp, color, Color.BLACK, width - offset_x, ypos, MyApplicationInterface.Alignment.ALIGNMENT_BOTTOM, null, draw_shadowed);
+ //applicationInterface.drawTextWithBackground(canvas, p, gps_stamp, color, Color.BLACK, width - offset_x, ypos, MyApplicationInterface.Alignment.ALIGNMENT_BOTTOM, null, draw_shadowed);
+ if( stamp_string.length() == 0 )
+ stamp_string = gps_stamp;
+ else
+ stamp_string = gps_stamp + "\n" + stamp_string;
ypos -= diff_y;
}
- else if( request.store_geo_direction ) {
+ /*else if( request.store_geo_direction ) {
if( MyDebug.LOG )
Log.d(TAG, "not displaying gps coords, but need to display geo direction");
// we are displaying an address instead of GPS coords, but we still need to display the geo direction
gps_stamp = main_activity.getTextFormatter().getGPSString(preference_stamp_gpsformat, request.preference_units_distance, false, null, request.store_geo_direction, request.geo_direction);
if( gps_stamp.length() > 0 ) {
// don't log gps_stamp, in case of privacy!
- applicationInterface.drawTextWithBackground(canvas, p, gps_stamp, color, Color.BLACK, width - offset_x, ypos, MyApplicationInterface.Alignment.ALIGNMENT_BOTTOM, null, draw_shadowed);
+ //applicationInterface.drawTextWithBackground(canvas, p, gps_stamp, color, Color.BLACK, width - offset_x, ypos, MyApplicationInterface.Alignment.ALIGNMENT_BOTTOM, null, draw_shadowed);
+ if( stamp_string.length() == 0 )
+ stamp_string = gps_stamp;
+ else
+ stamp_string = gps_stamp + "\n" + stamp_string;
ypos -= diff_y;
}
- }
+ }*/
- if( address != null ) {
+ /*if( address != null ) {
for(int i=0;i<=address.getMaxAddressLineIndex();i++) {
// write in reverse order
String addressLine = address.getAddressLine(address.getMaxAddressLineIndex()-i);
- applicationInterface.drawTextWithBackground(canvas, p, addressLine, color, Color.BLACK, width - offset_x, ypos, MyApplicationInterface.Alignment.ALIGNMENT_BOTTOM, null, draw_shadowed);
+ //applicationInterface.drawTextWithBackground(canvas, p, addressLine, color, Color.BLACK, width - offset_x, ypos, MyApplicationInterface.Alignment.ALIGNMENT_BOTTOM, null, draw_shadowed);
+ if( stamp_string.length() == 0 )
+ stamp_string = addressLine;
+ else
+ stamp_string = addressLine + "\n" + stamp_string;
ypos -= diff_y;
}
- }
+ }*/
}
}
if( text_stamp ) {
if( MyDebug.LOG )
Log.d(TAG, "stamp text");
- applicationInterface.drawTextWithBackground(canvas, p, request.preference_textstamp, color, Color.BLACK, width - offset_x, ypos, MyApplicationInterface.Alignment.ALIGNMENT_BOTTOM, null, draw_shadowed);
+
+ //applicationInterface.drawTextWithBackground(canvas, p, request.preference_textstamp, color, Color.BLACK, width - offset_x, ypos, MyApplicationInterface.Alignment.ALIGNMENT_BOTTOM, null, draw_shadowed);
+ if( stamp_string.length() == 0 )
+ stamp_string = request.preference_textstamp;
+ else
+ stamp_string = request.preference_textstamp + "\n" + stamp_string;
+
//noinspection UnusedAssignment
ypos -= diff_y;
}
+
+ if( stamp_string.length() > 0 ) {
+ // don't log stamp_string, in case of privacy!
+
+ @SuppressLint("InflateParams")
+ final View stamp_view = LayoutInflater.from(main_activity).inflate(R.layout.stamp_image_text, null);
+ final LinearLayout layout = stamp_view.findViewById(R.id.layout);
+ final TextView textview = stamp_view.findViewById(R.id.text_view);
+
+ textview.setVisibility(View.VISIBLE);
+ textview.setTextColor(color);
+ textview.setTextSize(TypedValue.COMPLEX_UNIT_PX, font_size_pixel);
+ textview.setText(stamp_string);
+ if( draw_shadowed == MyApplicationInterface.Shadow.SHADOW_OUTLINE ) {
+ //noinspection PointlessArithmeticExpression
+ float shadow_radius = (1.0f * scale + 0.5f); // convert pt to pixels
+ shadow_radius = Math.max(shadow_radius, 1.0f);
+ if( MyDebug.LOG )
+ Log.d(TAG, "shadow_radius: " + shadow_radius);
+ textview.setShadowLayer(shadow_radius, 0.0f, 0.0f, Color.BLACK);
+ }
+ else if( draw_shadowed == MyApplicationInterface.Shadow.SHADOW_BACKGROUND ) {
+ textview.setBackgroundColor(Color.argb(64, 0, 0, 0));
+ }
+ //textview.setBackgroundColor(Color.BLACK); // test
+ textview.setGravity(Gravity.END); // so text is right-aligned - important when there are multiple lines
+
+ layout.measure(canvas.getWidth(), canvas.getHeight());
+ layout.layout(0, 0, canvas.getWidth(), canvas.getHeight());
+ canvas.translate(width - offset_x - textview.getWidth(), height - offset_y - textview.getHeight());
+ layout.draw(canvas);
+ }
}
}
return bitmap;
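The repeated `stamp_string = X + "\n" + stamp_string` blocks above all follow one rule: stamping proceeds bottom-up (datetime first, custom text last, with `ypos` decreasing), so each new line is prepended and the finished string reads top-down for the `TextView`. A runnable distillation of that accumulation logic (the helper name is illustrative):

```java
public class StampString {
    /** Lines are generated bottom-up (datetime first, custom text last), so
     *  each new line is prepended; the finished string reads top-down, which
     *  matches how the TextView renders it onto the photo. */
    static String prependLine(String stamp, String line) {
        if( stamp.length() == 0 )
            return line;
        return line + "\n" + stamp;
    }

    public static void main(String[] args) {
        String stamp = "";
        stamp = prependLine(stamp, "2024-01-01 12:00:00"); // stamped first: bottom line
        stamp = prependLine(stamp, "51.50N 0.12W");        // above the datetime
        stamp = prependLine(stamp, "Holiday photo");       // stamped last: top line
        System.out.println(stamp);
    }
}
```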
@@ -2213,41 +2333,12 @@ public class ImageSaver extends Thread {
private static class PostProcessBitmapResult {
final Bitmap bitmap;
- final File exifTempFile;
- PostProcessBitmapResult(Bitmap bitmap, File exifTempFile) {
+ PostProcessBitmapResult(Bitmap bitmap) {
this.bitmap = bitmap;
- this.exifTempFile = exifTempFile;
}
}
- private File getExifTempFile(byte [] data) {
- File exifTempFile = null;
- // need to rotate the bitmap according to the exif orientation (which some devices use, e.g., Samsung)
- // so need to write to a temp file for this - we also use this later on to transfer the exif tags
- // on Android 7+, we can now read exif tags direct from the jpeg data
- if( Build.VERSION.SDK_INT < Build.VERSION_CODES.N ) {
- try {
- if( MyDebug.LOG )
- Log.d(TAG, "write temp file to record EXIF data");
- exifTempFile = File.createTempFile("opencamera_exif", "");
- OutputStream tempOutputStream = new FileOutputStream(exifTempFile);
- try {
- tempOutputStream.write(data);
- }
- finally {
- tempOutputStream.close();
- }
- }
- catch(IOException e) {
- if (MyDebug.LOG)
- Log.e(TAG, "exception writing to temp file");
- e.printStackTrace();
- }
- }
- return exifTempFile;
- }
-
/** Performs post-processing on the data, or bitmap if non-null, for saveSingleImageNow.
*/
private PostProcessBitmapResult postProcessBitmap(final Request request, byte [] data, Bitmap bitmap, boolean ignore_exif_orientation) throws IOException {
@@ -2257,51 +2348,41 @@ public class ImageSaver extends Thread {
boolean dategeo_stamp = request.preference_stamp.equals("preference_stamp_yes");
boolean text_stamp = request.preference_textstamp.length() > 0;
- File exifTempFile = null;
if( bitmap != null || request.image_format != Request.ImageFormat.STD || request.do_auto_stabilise || request.mirror || dategeo_stamp || text_stamp ) {
// either we have a bitmap, or will need to decode the bitmap to do post-processing
if( !ignore_exif_orientation ) {
- exifTempFile = getExifTempFile(data);
- if( MyDebug.LOG ) {
- Log.d(TAG, "Save single image performance: time after saving temp photo for EXIF: " + (System.currentTimeMillis() - time_s));
- }
-
if( bitmap != null ) {
// rotate the bitmap if necessary for exif tags
if( MyDebug.LOG )
Log.d(TAG, "rotate pre-existing bitmap for exif tags?");
- bitmap = rotateForExif(bitmap, data, exifTempFile);
+ bitmap = rotateForExif(bitmap, data);
}
}
}
if( request.do_auto_stabilise ) {
- bitmap = autoStabilise(data, bitmap, request.level_angle, request.is_front_facing, exifTempFile);
+ bitmap = autoStabilise(data, bitmap, request.level_angle, request.is_front_facing);
}
if( MyDebug.LOG ) {
Log.d(TAG, "Save single image performance: time after auto-stabilise: " + (System.currentTimeMillis() - time_s));
}
if( request.mirror ) {
- bitmap = mirrorImage(data, bitmap, exifTempFile);
+ bitmap = mirrorImage(data, bitmap);
}
if( request.image_format != Request.ImageFormat.STD && bitmap == null ) {
if( MyDebug.LOG )
Log.d(TAG, "need to decode bitmap to convert file format");
- bitmap = loadBitmapWithRotation(data, true, exifTempFile);
+ bitmap = loadBitmapWithRotation(data, true);
if( bitmap == null ) {
// if we can't load bitmap for converting file formats, don't want to continue
System.gc();
- if( exifTempFile != null && !exifTempFile.delete() ) {
- if( MyDebug.LOG )
- Log.e(TAG, "failed to delete temp " + exifTempFile.getAbsolutePath());
- }
throw new IOException();
}
}
- bitmap = stampImage(request, data, bitmap, exifTempFile);
+ bitmap = stampImage(request, data, bitmap);
if( MyDebug.LOG ) {
Log.d(TAG, "Save single image performance: time after photostamp: " + (System.currentTimeMillis() - time_s));
}
- return new PostProcessBitmapResult(bitmap, exifTempFile);
+ return new PostProcessBitmapResult(bitmap);
}
/** May be run in saver thread or picture callback thread (depending on whether running in background).
@@ -2360,18 +2441,16 @@ public class ImageSaver extends Thread {
main_activity.savingImage(true);
- File exifTempFile = null;
-
- // If saveUri is non-null, then:
- // Before Android 7, picFile is a temporary file which we use for saving exif tags too, and then we redirect the picFile to saveUri.
- // On Android 7+, picFile is null - we can write the exif tags direct to the saveUri.
+ // If using SAF, scoped storage, or an image capture intent (image_capture_intent is true), only saveUri is non-null.
+ // Otherwise, only picFile is non-null.
File picFile = null;
Uri saveUri = null;
+ boolean use_media_store = false;
+ ContentValues contentValues = null; // used if using scoped storage
try {
if( !raw_only ) {
PostProcessBitmapResult postProcessBitmapResult = postProcessBitmap(request, data, bitmap, ignore_exif_orientation);
bitmap = postProcessBitmapResult.bitmap;
- exifTempFile = postProcessBitmapResult.exifTempFile;
}
if( raw_only ) {
@@ -2398,7 +2477,7 @@ public class ImageSaver extends Thread {
if( MyDebug.LOG )
Log.d(TAG, "create bitmap");
// bitmap we return doesn't need to be mutable
- bitmap = loadBitmapWithRotation(data, false, exifTempFile);
+ bitmap = loadBitmapWithRotation(data, false);
}
if( bitmap != null ) {
int width = bitmap.getWidth();
@@ -2433,17 +2512,64 @@ public class ImageSaver extends Thread {
}
if( bitmap != null )
main_activity.setResult(Activity.RESULT_OK, new Intent("inline-data").putExtra("data", bitmap));
- if( exifTempFile != null && !exifTempFile.delete() ) {
- if( MyDebug.LOG )
- Log.e(TAG, "failed to delete temp " + exifTempFile.getAbsolutePath());
- }
- exifTempFile = null;
main_activity.finish();
}
}
else if( storageUtils.isUsingSAF() ) {
saveUri = storageUtils.createOutputMediaFileSAF(StorageUtils.MEDIA_TYPE_IMAGE, filename_suffix, extension, request.current_date);
}
+ else if( MainActivity.useScopedStorage() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "use media store");
+ use_media_store = true;
+ Uri folder = Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ?
+ MediaStore.Images.Media.getContentUri(MediaStore.VOLUME_EXTERNAL_PRIMARY) :
+ MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
+ contentValues = new ContentValues();
+ String picName = storageUtils.createMediaFilename(StorageUtils.MEDIA_TYPE_IMAGE, filename_suffix, 0, "." + extension, request.current_date);
+ if( MyDebug.LOG )
+ Log.d(TAG, "picName: " + picName);
+ contentValues.put(MediaStore.Images.Media.DISPLAY_NAME, picName);
+ String mime_type = storageUtils.getImageMimeType(extension);
+ if( MyDebug.LOG )
+ Log.d(TAG, "mime_type: " + mime_type);
+ contentValues.put(MediaStore.Images.Media.MIME_TYPE, mime_type);
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ String relative_path = storageUtils.getSaveRelativeFolder();
+ if( MyDebug.LOG )
+ Log.d(TAG, "relative_path: " + relative_path);
+ contentValues.put(MediaStore.Images.Media.RELATIVE_PATH, relative_path);
+ contentValues.put(MediaStore.Images.Media.IS_PENDING, 1);
+ }
+
+ // Note, we catch exceptions specific to insert() here and rethrow as IOException,
+ // rather than catching below, to avoid catching things too broadly - e.g.,
+ // IllegalStateException can also be thrown via "new Canvas" (from
+ // postProcessBitmap()) but this is a programming error that we shouldn't catch.
+ // Catching too broadly could mean we miss genuine problems that should be fixed.
+ try {
+ saveUri = main_activity.getContentResolver().insert(folder, contentValues);
+ }
+ catch(IllegalArgumentException e) {
+ // can happen for mediastore method if invalid ContentResolver.insert() call
+ if( MyDebug.LOG )
+ Log.e(TAG, "IllegalArgumentException inserting to mediastore: " + e.getMessage());
+ e.printStackTrace();
+ throw new IOException();
+ }
+ catch(IllegalStateException e) {
+ // have received Google Play crashes from ContentResolver.insert() call for mediastore method
+ if( MyDebug.LOG )
+ Log.e(TAG, "IllegalStateException inserting to mediastore: " + e.getMessage());
+ e.printStackTrace();
+ throw new IOException();
+ }
+ if( MyDebug.LOG )
+ Log.d(TAG, "saveUri: " + saveUri);
+ if( saveUri == null ) {
+ throw new IOException();
+ }
+ }
else {
picFile = storageUtils.createOutputMediaFile(StorageUtils.MEDIA_TYPE_IMAGE, filename_suffix, extension, request.current_date);
if( MyDebug.LOG )
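The comment above the `ContentResolver.insert()` call explains the choice to catch `IllegalArgumentException` and `IllegalStateException` narrowly at the call site and rethrow as `IOException`, rather than widening a catch block further down. A plain-Java sketch of that pattern (the stand-in `insert()` and all names here are illustrative):

```java
import java.io.IOException;

public class NarrowCatch {
    // Stand-in for ContentResolver.insert(): rejects invalid input.
    static String insert(String displayName) {
        if( displayName == null || displayName.isEmpty() )
            throw new IllegalArgumentException("bad display name");
        return "content://demo/images/" + displayName;
    }

    /** Catch only the exceptions insert() is known to throw, right at the
     *  call site, and rethrow as IOException. An IllegalStateException thrown
     *  by unrelated code elsewhere (a programming error) still propagates
     *  instead of being swallowed. */
    static String insertOrThrowIO(String displayName) throws IOException {
        try {
            return insert(displayName);
        }
        catch(IllegalArgumentException | IllegalStateException e) {
            throw new IOException("mediastore insert failed", e);
        }
    }

    static boolean demo() {
        try {
            return insertOrThrowIO("IMG_0001.jpg").startsWith("content://");
        }
        catch(IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```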
@@ -2452,11 +2578,6 @@ public class ImageSaver extends Thread {
if( MyDebug.LOG )
Log.d(TAG, "saveUri: " + saveUri);
- if( saveUri != null && picFile == null && Build.VERSION.SDK_INT < Build.VERSION_CODES.N ) {
- picFile = File.createTempFile("picFile", "jpg", main_activity.getCacheDir());
- if( MyDebug.LOG )
- Log.d(TAG, "temp picFile: " + picFile.getAbsolutePath());
- }
if( picFile != null || saveUri != null ) {
OutputStream outputStream;
@@ -2503,14 +2624,14 @@ public class ImageSaver extends Thread {
// handle transferring/setting Exif tags (JPEG format only)
if( bitmap != null ) {
// need to update EXIF data! (only supported for JPEG image formats)
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N ) {
- if( MyDebug.LOG )
- Log.d(TAG, "set Exif tags from data");
- if( picFile != null ) {
- setExifFromData(request, data, picFile);
- }
- else {
- ParcelFileDescriptor parcelFileDescriptor = main_activity.getContentResolver().openFileDescriptor(saveUri, "rw");
+ if( MyDebug.LOG )
+ Log.d(TAG, "set Exif tags from data");
+ if( picFile != null ) {
+ setExifFromData(request, data, picFile);
+ }
+ else {
+ ParcelFileDescriptor parcelFileDescriptor = main_activity.getContentResolver().openFileDescriptor(saveUri, "rw");
+ try {
if( parcelFileDescriptor != null ) {
FileDescriptor fileDescriptor = parcelFileDescriptor.getFileDescriptor();
setExifFromData(request, data, fileDescriptor);
@@ -2519,23 +2640,16 @@ public class ImageSaver extends Thread {
Log.e(TAG, "failed to create ParcelFileDescriptor for saveUri: " + saveUri);
}
}
- }
- else {
- if( MyDebug.LOG )
- Log.d(TAG, "set Exif tags from file");
- if( picFile == null ) {
- throw new RuntimeException("should have set picFile on pre-Android 7!");
- }
- if( exifTempFile != null ) {
- setExifFromFile(request, exifTempFile, picFile);
- if( MyDebug.LOG ) {
- Log.d(TAG, "Save single image performance: time after copying EXIF: " + (System.currentTimeMillis() - time_s));
+ finally {
+ if( parcelFileDescriptor != null ) {
+ try {
+ parcelFileDescriptor.close();
+ }
+ catch(IOException e) {
+ e.printStackTrace();
+ }
}
}
- else {
- if( MyDebug.LOG )
- Log.d(TAG, "can't set Exif tags without file pre-Android 7");
- }
}
}
else {
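The new `finally` block above guarantees the `ParcelFileDescriptor` is closed whether or not `setExifFromData()` throws, while a close-time `IOException` is logged rather than allowed to mask the primary outcome. The same pattern with a plain `java.io.Closeable`, runnable outside Android (names are illustrative):

```java
import java.io.Closeable;
import java.io.IOException;

public class CloseInFinally {
    private static int closeCount = 0;

    // Stand-in for openFileDescriptor(): returns something that must be closed.
    static Closeable open() {
        return () -> closeCount++;
    }

    /** Use the resource, then close it in finally; log (rather than rethrow)
     *  a close-time IOException so it cannot hide an exception thrown by the
     *  main work. Returns the running close count for demonstration. */
    static int useAndClose() {
        Closeable fd = open();
        try {
            // ... main work, e.g. writing exif tags via the descriptor ...
        }
        finally {
            if( fd != null ) {
                try {
                    fd.close();
                }
                catch(IOException e) {
                    e.printStackTrace();
                }
            }
        }
        return closeCount;
    }

    public static void main(String[] args) {
        System.out.println(useAndClose());
    }
}
```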
@@ -2546,6 +2660,11 @@ public class ImageSaver extends Thread {
}
}
+ if( update_thumbnail ) {
+ // clear just in case we're unable to update this - don't want an out of date cached uri
+ storageUtils.clearLastMediaScanned();
+ }
+
if( picFile != null && saveUri == null ) {
// broadcast for SAF is done later, when we've actually written out the file
storageUtils.broadcastFile(picFile, true, false, update_thumbnail);
@@ -2558,17 +2677,36 @@ public class ImageSaver extends Thread {
main_activity.setResult(Activity.RESULT_OK);
main_activity.finish();
}
- if( storageUtils.isUsingSAF() ) {
- // most Gallery apps don't seem to recognise the SAF-format Uri, so just clear the field
- storageUtils.clearLastMediaScanned();
- }
if( saveUri != null ) {
- if( picFile != null ) {
- copyFileToUri(main_activity, saveUri, picFile);
- }
success = true;
- broadcastSAFFile(saveUri, request.image_capture_intent);
+
+ if( use_media_store ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ contentValues.clear();
+ contentValues.put(MediaStore.Images.Media.IS_PENDING, 0);
+ main_activity.getContentResolver().update(saveUri, contentValues, null, null);
+ }
+
+ // no need to broadcast when using mediastore method
+ if( !request.image_capture_intent ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "announce mediastore uri");
+ // in theory this is pointless, as announceUri no longer does anything on Android 7+,
+ // and mediastore method is only used on Android 10+, but keep this just in case
+ // announceUri does something in future
+ storageUtils.announceUri(saveUri, true, false);
+ if( update_thumbnail ) {
+ // we also want to save the uri - we can use the media uri directly, rather than having to scan it
+ storageUtils.setLastMediaScanned(saveUri, false);
+ }
+ }
+ }
+ else {
+ broadcastSAFFile(saveUri, update_thumbnail, request.image_capture_intent);
+ }
+
+ main_activity.test_last_saved_imageuri = saveUri;
}
}
}
@@ -2586,17 +2724,13 @@ public class ImageSaver extends Thread {
}
catch(SecurityException e) {
// received security exception from copyFileToUri()->openOutputStream() from Google Play
+ // update: copyFileToUri() no longer exists (we no longer use temporary files for SAF), but might as well keep this
if( MyDebug.LOG )
Log.e(TAG, "security exception writing file: " + e.getMessage());
e.printStackTrace();
main_activity.getPreview().showToast(null, R.string.failed_to_save_photo);
}
- if( exifTempFile != null && !exifTempFile.delete() ) {
- if( MyDebug.LOG )
- Log.e(TAG, "failed to delete temp " + exifTempFile.getAbsolutePath());
- }
-
if( raw_only ) {
// no saved image to record
}
@@ -2606,6 +2740,9 @@ public class ImageSaver extends Thread {
else if( success && storageUtils.isUsingSAF() ){
applicationInterface.addLastImageSAF(saveUri, share_image);
}
+ else if( success && use_media_store ){
+ applicationInterface.addLastImageMediaStore(saveUri, share_image);
+ }
// I have received crashes where camera_controller was null - could perhaps happen if this thread was running just as the camera is closing?
if( success && main_activity.getPreview().getCameraController() != null && update_thumbnail ) {
@@ -2637,7 +2774,7 @@ public class ImageSaver extends Thread {
// now get the rotation from the Exif data
if( MyDebug.LOG )
Log.d(TAG, "rotate thumbnail for exif tags?");
- thumbnail = rotateForExif(thumbnail, data, picFile);
+ thumbnail = rotateForExif(thumbnail, data);
}
else {
int width = bitmap.getWidth();
@@ -2689,15 +2826,6 @@ public class ImageSaver extends Thread {
bitmap.recycle();
}
- if( picFile != null && saveUri != null ) {
- if( MyDebug.LOG )
- Log.d(TAG, "delete temp picFile: " + picFile);
- if( !picFile.delete() ) {
- if( MyDebug.LOG )
- Log.e(TAG, "failed to delete temp picFile: " + picFile);
- }
- }
-
System.gc();
main_activity.savingImage(false);
@@ -2710,7 +2838,6 @@ public class ImageSaver extends Thread {
/** As setExifFromFile, but can read the Exif tags directly from the jpeg data rather than a file.
*/
- @RequiresApi(api = Build.VERSION_CODES.N)
private void setExifFromData(final Request request, byte [] data, File to_file) throws IOException {
if( MyDebug.LOG ) {
Log.d(TAG, "setExifFromData");
@@ -2730,42 +2857,15 @@ public class ImageSaver extends Thread {
}
}
- private void broadcastSAFFile(Uri saveUri, boolean image_capture_intent) {
+ private void broadcastSAFFile(Uri saveUri, boolean set_last_scanned, boolean image_capture_intent) {
if( MyDebug.LOG )
Log.d(TAG, "broadcastSAFFile");
- /* We still need to broadcastFile for SAF for two reasons:
- 1. To call storageUtils.announceUri() to broadcast NEW_PICTURE etc.
- Whilst in theory we could do this directly, it seems external apps that use such broadcasts typically
- won't know what to do with a SAF based Uri (e.g, Owncloud crashes!) so better to broadcast the Uri
- corresponding to the real file, if it exists.
- 2. Whilst the new file seems to be known by external apps such as Gallery without having to call media
- scanner, I've had reports this doesn't happen when saving to external SD cards. So better to explicitly
- scan.
- Note this will no longer work on Android Q's scoped storage (getFileFromDocumentUriSAF will return null).
- But NEW_PICTURE etc are no longer sent on Android 7+ anyway.
- */
StorageUtils storageUtils = main_activity.getStorageUtils();
- File real_file = storageUtils.getFileFromDocumentUriSAF(saveUri, false);
- if( MyDebug.LOG )
- Log.d(TAG, "real_file: " + real_file);
- if( real_file != null ) {
- if( MyDebug.LOG )
- Log.d(TAG, "broadcast file");
- storageUtils.broadcastFile(real_file, true, false, true);
- main_activity.test_last_saved_image = real_file.getAbsolutePath();
- }
- else if( !image_capture_intent ) {
- if( MyDebug.LOG )
- Log.d(TAG, "announce SAF uri");
- // announce the SAF Uri
- // (shouldn't do this for a capture intent - e.g., causes crash when calling from Google Keep)
- storageUtils.announceUri(saveUri, true, false);
- }
+ storageUtils.broadcastUri(saveUri, true, false, set_last_scanned, image_capture_intent);
}
/** As setExifFromFile, but can read the Exif tags directly from the jpeg data, and to a file descriptor, rather than a file.
*/
- @RequiresApi(api = Build.VERSION_CODES.N)
private void setExifFromData(final Request request, byte [] data, FileDescriptor to_file_descriptor) throws IOException {
if( MyDebug.LOG ) {
Log.d(TAG, "setExifFromData");
@@ -2785,31 +2885,6 @@ public class ImageSaver extends Thread {
}
}
- /** Used to transfer exif tags, if we had to convert the jpeg info to a bitmap (for post-processing such as
- * auto-stabilise or photo stamp). Also then applies the Exif tags according to the preferences in the request.
- * Note that we use several ExifInterface tags that are now deprecated in API level 23 and 24. These are replaced with new tags that have
- * the same string value (e.g., TAG_APERTURE replaced with TAG_F_NUMBER, but both have value "FNumber"). We use the deprecated versions
- * to avoid complicating the code (we'd still have to read the deprecated values for older devices).
- */
- private void setExifFromFile(final Request request, File from_file, File to_file) throws IOException {
- if( MyDebug.LOG ) {
- Log.d(TAG, "setExifFromFile");
- Log.d(TAG, "from_file: " + from_file);
- Log.d(TAG, "to_file: " + to_file);
- }
- try {
- ExifInterface exif = new ExifInterface(from_file.getAbsolutePath());
- ExifInterface exif_new = new ExifInterface(to_file.getAbsolutePath());
- setExif(request, exif, exif_new);
- }
- catch(NoClassDefFoundError exception) {
- // have had Google Play crashes from new ExifInterface() for Galaxy Ace4 (vivalto3g)
- if( MyDebug.LOG )
- Log.e(TAG, "exif orientation NoClassDefFoundError");
- exception.printStackTrace();
- }
- }
-
/** Transfers exif tags from exif to exif_new, and then applies any extra Exif tags according to the preferences in the request.
* Note that we use several ExifInterface tags that are now deprecated in API level 23 and 24. These are replaced with new tags that have
* the same string value (e.g., TAG_APERTURE replaced with TAG_F_NUMBER, but both have value "FNumber"). We use the deprecated versions
@@ -2821,7 +2896,7 @@ public class ImageSaver extends Thread {
if( MyDebug.LOG )
Log.d(TAG, "read back EXIF data");
- String exif_aperture = exif.getAttribute(ExifInterface.TAG_APERTURE); // same as TAG_F_NUMBER
+ String exif_aperture = exif.getAttribute(ExifInterface.TAG_F_NUMBER); // previously TAG_APERTURE
String exif_datetime = exif.getAttribute(ExifInterface.TAG_DATETIME);
String exif_exposure_time = exif.getAttribute(ExifInterface.TAG_EXPOSURE_TIME);
String exif_flash = exif.getAttribute(ExifInterface.TAG_FLASH);
@@ -2836,66 +2911,69 @@ public class ImageSaver extends Thread {
String exif_gps_processing_method = exif.getAttribute(ExifInterface.TAG_GPS_PROCESSING_METHOD);
String exif_gps_timestamp = exif.getAttribute(ExifInterface.TAG_GPS_TIMESTAMP);
// leave width/height, as this may have changed! similarly TAG_IMAGE_LENGTH?
- String exif_iso = exif.getAttribute(ExifInterface.TAG_ISO); // same as TAG_ISO_SPEED_RATINGS
+ //noinspection deprecation
+ String exif_iso = exif.getAttribute(ExifInterface.TAG_ISO_SPEED_RATINGS); // previously TAG_ISO
String exif_make = exif.getAttribute(ExifInterface.TAG_MAKE);
String exif_model = exif.getAttribute(ExifInterface.TAG_MODEL);
// leave orientation - since we rotate bitmaps to account for orientation, we don't want to write it to the saved image!
String exif_white_balance = exif.getAttribute(ExifInterface.TAG_WHITE_BALANCE);
- String exif_datetime_digitized = null;
- String exif_subsec_time = null;
- String exif_subsec_time_dig = null;
- String exif_subsec_time_orig = null;
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
+ String exif_datetime_digitized;
+ String exif_subsec_time;
+ String exif_subsec_time_dig;
+ String exif_subsec_time_orig;
+ {
// tags that are new in Android M - note we skip tags unlikely to be relevant for camera photos
+ // update, now available in all Android versions thanks to using AndroidX ExifInterface
exif_datetime_digitized = exif.getAttribute(ExifInterface.TAG_DATETIME_DIGITIZED);
exif_subsec_time = exif.getAttribute(ExifInterface.TAG_SUBSEC_TIME);
- exif_subsec_time_dig = exif.getAttribute(ExifInterface.TAG_SUBSEC_TIME_DIG); // same as TAG_SUBSEC_TIME_DIGITIZED
- exif_subsec_time_orig = exif.getAttribute(ExifInterface.TAG_SUBSEC_TIME_ORIG); // same as TAG_SUBSEC_TIME_ORIGINAL
- }
-
- String exif_aperture_value = null;
- String exif_brightness_value = null;
- String exif_cfa_pattern = null;
- String exif_color_space = null;
- String exif_components_configuration = null;
- String exif_compressed_bits_per_pixel = null;
- String exif_compression = null;
- String exif_contrast = null;
- String exif_datetime_original = null;
- String exif_device_setting_description = null;
- String exif_digital_zoom_ratio = null;
- String exif_exposure_bias_value = null;
- String exif_exposure_index = null;
- String exif_exposure_mode = null;
- String exif_exposure_program = null;
- String exif_flash_energy = null;
- String exif_focal_length_in_35mm_film = null;
- String exif_focal_plane_resolution_unit = null;
- String exif_focal_plane_x_resolution = null;
- String exif_focal_plane_y_resolution = null;
- String exif_gain_control = null;
- String exif_gps_area_information = null;
- String exif_gps_differential = null;
- String exif_gps_dop = null;
- String exif_gps_measure_mode = null;
- String exif_image_description = null;
- String exif_light_source = null;
- String exif_maker_note = null;
- String exif_max_aperture_value = null;
- String exif_metering_mode = null;
- String exif_oecf = null;
- String exif_photometric_interpretation = null;
- String exif_saturation = null;
- String exif_scene_capture_type = null;
- String exif_scene_type = null;
- String exif_sensing_method = null;
- String exif_sharpness = null;
- String exif_shutter_speed_value = null;
- String exif_software = null;
- String exif_user_comment = null;
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N ) {
+ exif_subsec_time_dig = exif.getAttribute(ExifInterface.TAG_SUBSEC_TIME_DIGITIZED); // previously TAG_SUBSEC_TIME_DIG
+ exif_subsec_time_orig = exif.getAttribute(ExifInterface.TAG_SUBSEC_TIME_ORIGINAL); // previously TAG_SUBSEC_TIME_ORIG
+ }
+
+ String exif_aperture_value;
+ String exif_brightness_value;
+ String exif_cfa_pattern;
+ String exif_color_space;
+ String exif_components_configuration;
+ String exif_compressed_bits_per_pixel;
+ String exif_compression;
+ String exif_contrast;
+ String exif_datetime_original;
+ String exif_device_setting_description;
+ String exif_digital_zoom_ratio;
+ String exif_exposure_bias_value;
+ String exif_exposure_index;
+ String exif_exposure_mode;
+ String exif_exposure_program;
+ String exif_flash_energy;
+ String exif_focal_length_in_35mm_film;
+ String exif_focal_plane_resolution_unit;
+ String exif_focal_plane_x_resolution;
+ String exif_focal_plane_y_resolution;
+ String exif_gain_control;
+ String exif_gps_area_information;
+ String exif_gps_differential;
+ String exif_gps_dop;
+ String exif_gps_measure_mode;
+ String exif_image_description;
+ String exif_light_source;
+ String exif_maker_note;
+ String exif_max_aperture_value;
+ String exif_metering_mode;
+ String exif_oecf;
+ String exif_photometric_interpretation;
+ String exif_saturation;
+ String exif_scene_capture_type;
+ String exif_scene_type;
+ String exif_sensing_method;
+ String exif_sharpness;
+ String exif_shutter_speed_value;
+ String exif_software;
+ String exif_user_comment;
+ {
// tags that are new in Android N - note we skip tags unlikely to be relevant for camera photos
+ // update, now available in all Android versions thanks to using AndroidX ExifInterface
exif_aperture_value = exif.getAttribute(ExifInterface.TAG_APERTURE_VALUE);
exif_brightness_value = exif.getAttribute(ExifInterface.TAG_BRIGHTNESS_VALUE);
exif_cfa_pattern = exif.getAttribute(ExifInterface.TAG_CFA_PATTERN);
@@ -2967,7 +3045,7 @@ public class ImageSaver extends Thread {
if( MyDebug.LOG )
Log.d(TAG, "now write new EXIF data");
if( exif_aperture != null )
- exif_new.setAttribute(ExifInterface.TAG_APERTURE, exif_aperture);
+ exif_new.setAttribute(ExifInterface.TAG_F_NUMBER, exif_aperture);
if( exif_datetime != null )
exif_new.setAttribute(ExifInterface.TAG_DATETIME, exif_datetime);
if( exif_exposure_time != null )
@@ -2995,7 +3073,8 @@ public class ImageSaver extends Thread {
if( exif_gps_timestamp != null )
exif_new.setAttribute(ExifInterface.TAG_GPS_TIMESTAMP, exif_gps_timestamp);
if( exif_iso != null )
- exif_new.setAttribute(ExifInterface.TAG_ISO, exif_iso);
+ //noinspection deprecation
+ exif_new.setAttribute(ExifInterface.TAG_ISO_SPEED_RATINGS, exif_iso);
if( exif_make != null )
exif_new.setAttribute(ExifInterface.TAG_MAKE, exif_make);
if( exif_model != null )
@@ -3003,18 +3082,18 @@ public class ImageSaver extends Thread {
if( exif_white_balance != null )
exif_new.setAttribute(ExifInterface.TAG_WHITE_BALANCE, exif_white_balance);
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
+ {
if( exif_datetime_digitized != null )
exif_new.setAttribute(ExifInterface.TAG_DATETIME_DIGITIZED, exif_datetime_digitized);
if( exif_subsec_time != null )
exif_new.setAttribute(ExifInterface.TAG_SUBSEC_TIME, exif_subsec_time);
if( exif_subsec_time_dig != null )
- exif_new.setAttribute(ExifInterface.TAG_SUBSEC_TIME_DIG, exif_subsec_time_dig);
+ exif_new.setAttribute(ExifInterface.TAG_SUBSEC_TIME_DIGITIZED, exif_subsec_time_dig);
if( exif_subsec_time_orig != null )
- exif_new.setAttribute(ExifInterface.TAG_SUBSEC_TIME_ORIG, exif_subsec_time_orig);
+ exif_new.setAttribute(ExifInterface.TAG_SUBSEC_TIME_ORIGINAL, exif_subsec_time_orig);
}
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N ) {
+ {
if( exif_aperture_value != null )
exif_new.setAttribute(ExifInterface.TAG_APERTURE_VALUE, exif_aperture_value);
if( exif_brightness_value != null )
@@ -3097,7 +3176,7 @@ public class ImageSaver extends Thread {
exif_new.setAttribute(ExifInterface.TAG_USER_COMMENT, exif_user_comment);
}
- modifyExif(exif_new, request.type == Request.Type.JPEG, request.using_camera2, request.current_date, request.store_location, request.store_geo_direction, request.geo_direction, request.custom_tag_artist, request.custom_tag_copyright, request.level_angle, request.pitch_angle, request.store_ypr);
+ modifyExif(exif_new, request.type == Request.Type.JPEG, request.using_camera2, request.using_camera_extensions, request.current_date, request.store_location, request.location, request.store_geo_direction, request.geo_direction, request.custom_tag_artist, request.custom_tag_copyright, request.level_angle, request.pitch_angle, request.store_ypr);
setDateTimeExif(exif_new);
exif_new.saveAttributes();
}
@@ -3124,6 +3203,8 @@ public class ImageSaver extends Thread {
try {
File picFile = null;
Uri saveUri = null;
+ boolean use_media_store = false;
+ ContentValues contentValues = null; // used if using scoped storage
String suffix = "_";
String filename_suffix = (request.force_suffix) ? suffix + (request.suffix_offset) : "";
@@ -3134,6 +3215,44 @@ public class ImageSaver extends Thread {
// When using SAF, we don't save to a temp file first (unlike for JPEGs). Firstly we don't need to modify Exif, so don't
// need a real file; secondly copying to a temp file is much slower for RAW.
}
+ else if( MainActivity.useScopedStorage() ) {
+ use_media_store = true;
+ Uri folder = Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ?
+ MediaStore.Images.Media.getContentUri(MediaStore.VOLUME_EXTERNAL_PRIMARY) :
+ MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
+ contentValues = new ContentValues();
+ String picName = storageUtils.createMediaFilename(StorageUtils.MEDIA_TYPE_IMAGE, filename_suffix, 0, ".dng", request.current_date);
+ contentValues.put(MediaStore.Images.Media.DISPLAY_NAME, picName);
+ contentValues.put(MediaStore.Images.Media.MIME_TYPE, "image/dng");
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ contentValues.put(MediaStore.Images.Media.RELATIVE_PATH, storageUtils.getSaveRelativeFolder());
+ contentValues.put(MediaStore.Images.Media.IS_PENDING, 1);
+ }
+
+ // Note, we catch exceptions specific to insert() here and rethrow as IOException,
+ // rather than catching below, to avoid catching things too broadly.
+ // Catching too broadly could mean we miss genuine problems that should be fixed.
+ try {
+ saveUri = main_activity.getContentResolver().insert(folder, contentValues);
+ }
+ catch(IllegalArgumentException e) {
+ // can happen for mediastore method if invalid ContentResolver.insert() call
+ if( MyDebug.LOG )
+ Log.e(TAG, "IllegalArgumentException inserting to mediastore: " + e.getMessage());
+ e.printStackTrace();
+ throw new IOException();
+ }
+ catch(IllegalStateException e) {
+ if( MyDebug.LOG )
+ Log.e(TAG, "IllegalStateException inserting to mediastore: " + e.getMessage());
+ e.printStackTrace();
+ throw new IOException();
+ }
+ if( MyDebug.LOG )
+ Log.d(TAG, "saveUri: " + saveUri);
+ if( saveUri == null )
+ throw new IOException();
+ }
else {
picFile = storageUtils.createOutputMediaFile(StorageUtils.MEDIA_TYPE_IMAGE, filename_suffix, "dng", request.current_date);
if( MyDebug.LOG )
@@ -3167,12 +3286,40 @@ public class ImageSaver extends Thread {
else if( storageUtils.isUsingSAF() ){
applicationInterface.addLastImageSAF(saveUri, raw_only);
}
+ else if( success && use_media_store ){
+ applicationInterface.addLastImageMediaStore(saveUri, raw_only);
+ }
+
+ // if RAW only, need to update the cached uri
+ if( raw_only ) {
+ // clear just in case we're unable to update this - don't want an out-of-date cached uri
+ storageUtils.clearLastMediaScanned();
+ }
if( saveUri == null ) {
- storageUtils.broadcastFile(picFile, true, false, false);
+ storageUtils.broadcastFile(picFile, true, false, raw_only);
+ }
+ else if( use_media_store ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ contentValues.clear();
+ contentValues.put(MediaStore.Images.Media.IS_PENDING, 0);
+ main_activity.getContentResolver().update(saveUri, contentValues, null, null);
+ }
+
+ // no need to broadcast when using mediastore method
+
+ // in theory this is pointless, as announceUri no longer does anything on Android 7+,
+ // and the mediastore method is only used on Android 10+, but keep this just in case
+ // announceUri does something in future
+ storageUtils.announceUri(saveUri, true, false);
+
+ if( raw_only ) {
+ // we also want to save the uri - we can use the media uri directly, rather than having to scan it
+ storageUtils.setLastMediaScanned(saveUri, true);
+ }
}
else {
- storageUtils.broadcastUri(saveUri, true, false, false);
+ storageUtils.broadcastUri(saveUri, true, false, raw_only, false);
}
}
catch(FileNotFoundException e) {
@@ -3210,38 +3357,22 @@ public class ImageSaver extends Thread {
return success;
}
- /** Rotates the supplied bitmap according to the orientation tag stored in the exif data. On
- * Android 7 onwards, we use the jpeg data; on earlier versions the supplied exifTimeFile is
- * used. If no rotation is required, the input bitmap is returned.
+ /** Rotates the supplied bitmap according to the orientation tag stored in the exif data. If no
+ * rotation is required, the input bitmap is returned. If rotation is required, the input
+ * bitmap is recycled.
* @param data Jpeg data containing the Exif information to use.
- * @param exifTempFile Ignored on Android 7+. If this is null on older versions, the bitmap is
- * returned without rotation.
*/
- private Bitmap rotateForExif(Bitmap bitmap, byte [] data, File exifTempFile) {
+ private Bitmap rotateForExif(Bitmap bitmap, byte [] data) {
if( MyDebug.LOG )
Log.d(TAG, "rotateForExif");
InputStream inputStream = null;
try {
ExifInterface exif;
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N ) {
- if( MyDebug.LOG )
- Log.d(TAG, "Android 7: use data stream to read exif tags");
- inputStream = new ByteArrayInputStream(data);
- exif = new ExifInterface(inputStream);
- }
- else {
- if( MyDebug.LOG )
- Log.d(TAG, "pre-Android 7: use file to read exif tags: " + exifTempFile);
- if( exifTempFile != null ) {
- exif = new ExifInterface(exifTempFile.getAbsolutePath());
- }
- else {
- if( MyDebug.LOG )
- Log.d(TAG, "but no file available to read exif tags from");
- return bitmap;
- }
- }
+ if( MyDebug.LOG )
+ Log.d(TAG, "use data stream to read exif tags");
+ inputStream = new ByteArrayInputStream(data);
+ exif = new ExifInterface(inputStream);
int exif_orientation_s = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_UNDEFINED);
if( MyDebug.LOG )
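The core of what rotateForExif() does can be summarised independently of Android: map the Exif TAG_ORIENTATION value to a clockwise rotation. The constants below are copied from the Exif specification (they match the ExifInterface.ORIENTATION_* values), so this sketch compiles as plain Java; the mirrored orientations are omitted for brevity:

```java
// Sketch of the orientation-to-rotation mapping used when rotating a bitmap
// to match its Exif orientation tag. Constant values are from the Exif spec.
class ExifRotation {
    static final int ORIENTATION_NORMAL = 1;
    static final int ORIENTATION_ROTATE_180 = 3;
    static final int ORIENTATION_ROTATE_90 = 6;  // 90 degrees clockwise
    static final int ORIENTATION_ROTATE_270 = 8; // 270 degrees clockwise

    /** Returns the clockwise rotation in degrees needed to display the image upright. */
    static int rotationDegrees(int exifOrientation) {
        switch (exifOrientation) {
            case ORIENTATION_ROTATE_90:  return 90;
            case ORIENTATION_ROTATE_180: return 180;
            case ORIENTATION_ROTATE_270: return 270;
            default: return 0; // normal, undefined, or mirrored variants not handled here
        }
    }
}
```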
@@ -3316,36 +3447,75 @@ public class ImageSaver extends Thread {
* supplied EXIF orientation tag.
* @param data The jpeg data.
* @param mutable Whether to create a mutable bitmap.
- * @param exifTempFile Temporary file that can be used to read exif tags (for orientation).
* @return A bitmap representing the correctly rotated jpeg.
*/
- private Bitmap loadBitmapWithRotation(byte [] data, boolean mutable, File exifTempFile) {
+ private Bitmap loadBitmapWithRotation(byte [] data, boolean mutable) {
Bitmap bitmap = loadBitmap(data, mutable, 1);
if( bitmap != null ) {
// rotate the bitmap if necessary for exif tags
if( MyDebug.LOG )
Log.d(TAG, "rotate bitmap for exif tags?");
- bitmap = rotateForExif(bitmap, data, exifTempFile);
+ bitmap = rotateForExif(bitmap, data);
}
return bitmap;
}
+ /* In some cases we may create an ExifInterface with a FileDescriptor obtained from a
+ * ParcelFileDescriptor (via getFileDescriptor()). It's important to keep a reference to the
+ * ParcelFileDescriptor object for as long as the ExifInterface is in use, otherwise there's a risk of
+ * the ParcelFileDescriptor being garbage collected, invalidating the file descriptor still
+ * being used by the ExifInterface!
+ * This didn't cause any known bugs, but good practice to fix, similar to the issue reported in
+ * https://sourceforge.net/p/opencamera/tickets/417/ .
+ * Also important to call the close() method when done with it, to close the
+ * ParcelFileDescriptor (if one was created).
+ */
+ private static class ExifInterfaceHolder {
+ // see documentation above about keeping hold of pfd due to the garbage collector!
+ private final ParcelFileDescriptor pfd;
+ private final ExifInterface exif;
+
+ ExifInterfaceHolder(ParcelFileDescriptor pfd, ExifInterface exif) {
+ this.pfd = pfd;
+ this.exif = exif;
+ }
+
+ ExifInterface getExif() {
+ return this.exif;
+ }
+
+ void close() {
+ if( this.pfd != null ) {
+ try {
+ this.pfd.close();
+ }
+ catch(IOException e) {
+ Log.e(TAG, "failed to close parcelfiledescriptor");
+ e.printStackTrace();
+ }
+ }
+ }
+ }
+
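The lifetime issue the comment describes is general: an object derived from a resource must keep its owner reachable, and the owner should be closed explicitly rather than left to the garbage collector. A plain-Java sketch of the same pattern, with hypothetical names (the real class pairs a ParcelFileDescriptor with an ExifInterface):

```java
import java.io.Closeable;
import java.io.IOException;

// Hypothetical generic form of ExifInterfaceHolder: holding the owner field
// keeps the underlying resource reachable (not garbage collected) for as long
// as the derived object is in use, and close() releases it deterministically.
class ResourceHolder<T> implements Closeable {
    private final Closeable owner; // e.g. a ParcelFileDescriptor
    private final T derived;       // e.g. an ExifInterface built from its file descriptor

    ResourceHolder(Closeable owner, T derived) {
        this.owner = owner;
        this.derived = derived;
    }

    T get() { return derived; }

    @Override
    public void close() {
        if (owner != null) {
            try {
                owner.close();
            }
            catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
```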
/** Creates a new exif interface for reading and writing.
- * If picFile==null, then saveUri must be non-null (and the Android version must be Android 7
- * or later), and will be used instead to write the exif tags too.
- * May return null if unable to create the exif interface.
+ * If picFile==null, then saveUri must be non-null, and the exif tags will be written to it
+ * instead.
+ * The returned ExifInterfaceHolder will always be non-null, but the contained getExif() may
+ * return null if this method was unable to create the exif interface.
+ * The caller should call close() on the returned ExifInterfaceHolder when no longer required.
*/
- private ExifInterface createExifInterface(File picFile, Uri saveUri) throws IOException {
+ private ExifInterfaceHolder createExifInterface(File picFile, Uri saveUri) throws IOException {
+ ParcelFileDescriptor parcelFileDescriptor = null;
ExifInterface exif = null;
if( picFile != null ) {
if( MyDebug.LOG )
Log.d(TAG, "write to picFile: " + picFile);
exif = new ExifInterface(picFile.getAbsolutePath());
}
- else if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N ) {
+ else {
if( MyDebug.LOG )
Log.d(TAG, "write direct to saveUri: " + saveUri);
- ParcelFileDescriptor parcelFileDescriptor = main_activity.getContentResolver().openFileDescriptor(saveUri, "rw");
+ parcelFileDescriptor = main_activity.getContentResolver().openFileDescriptor(saveUri, "rw");
if( parcelFileDescriptor != null ) {
FileDescriptor fileDescriptor = parcelFileDescriptor.getFileDescriptor();
exif = new ExifInterface(fileDescriptor);
@@ -3354,30 +3524,32 @@ public class ImageSaver extends Thread {
Log.e(TAG, "failed to create ParcelFileDescriptor for saveUri: " + saveUri);
}
}
- else {
- // throw runtimeexception, as this is a programming error
- throw new RuntimeException("picFile==null but Android version is not 7 or later");
- }
- return exif;
+ return new ExifInterfaceHolder(parcelFileDescriptor, exif);
}
/** Makes various modifications to the saved image file, according to the preferences in request.
* This method is used when saving directly from the JPEG data rather than a bitmap.
- * If picFile==null, then saveUri must be non-null (and the Android version must be Android 7
- * or later), and will be used instead to write the exif tags too.
+ * If picFile==null, then saveUri must be non-null, and the exif tags will be written to it
+ * instead.
*/
private void updateExif(Request request, File picFile, Uri saveUri) throws IOException {
if( MyDebug.LOG )
Log.d(TAG, "updateExif: " + picFile);
- if( request.store_geo_direction || request.store_ypr || hasCustomExif(request.custom_tag_artist, request.custom_tag_copyright) ) {
+ if( request.store_geo_direction || request.store_ypr || hasCustomExif(request.custom_tag_artist, request.custom_tag_copyright) || request.using_camera_extensions ) {
long time_s = System.currentTimeMillis();
if( MyDebug.LOG )
Log.d(TAG, "add additional exif info");
try {
- ExifInterface exif = createExifInterface(picFile, saveUri);
- if( exif != null ) {
- modifyExif(exif, request.type == Request.Type.JPEG, request.using_camera2, request.current_date, request.store_location, request.store_geo_direction, request.geo_direction, request.custom_tag_artist, request.custom_tag_copyright, request.level_angle, request.pitch_angle, request.store_ypr);
- exif.saveAttributes();
+ ExifInterfaceHolder exif_holder = createExifInterface(picFile, saveUri);
+ try {
+ ExifInterface exif = exif_holder.getExif();
+ if( exif != null ) {
+ modifyExif(exif, request.type == Request.Type.JPEG, request.using_camera2, request.using_camera_extensions, request.current_date, request.store_location, request.location, request.store_geo_direction, request.geo_direction, request.custom_tag_artist, request.custom_tag_copyright, request.level_angle, request.pitch_angle, request.store_ypr);
+ exif.saveAttributes();
+ }
+ }
+ finally {
+ exif_holder.close();
}
}
catch(NoClassDefFoundError exception) {
@@ -3393,10 +3565,16 @@ public class ImageSaver extends Thread {
if( MyDebug.LOG )
Log.d(TAG, "remove GPS timestamp hack");
try {
- ExifInterface exif = createExifInterface(picFile, saveUri);
- if( exif != null ) {
- fixGPSTimestamp(exif, request.current_date);
- exif.saveAttributes();
+ ExifInterfaceHolder exif_holder = createExifInterface(picFile, saveUri);
+ try {
+ ExifInterface exif = exif_holder.getExif();
+ if( exif != null ) {
+ fixGPSTimestamp(exif, request.current_date);
+ exif.saveAttributes();
+ }
+ }
+ finally {
+ exif_holder.close();
}
}
catch(NoClassDefFoundError exception) {
@@ -3414,11 +3592,11 @@ public class ImageSaver extends Thread {
/** Makes various modifications to the exif data, if necessary.
*/
- private void modifyExif(ExifInterface exif, boolean is_jpeg, boolean using_camera2, Date current_date, boolean store_location, boolean store_geo_direction, double geo_direction, String custom_tag_artist, String custom_tag_copyright, double level_angle, double pitch_angle, boolean store_ypr) {
+ private void modifyExif(ExifInterface exif, boolean is_jpeg, boolean using_camera2, boolean using_camera_extensions, Date current_date, boolean store_location, Location location, boolean store_geo_direction, double geo_direction, String custom_tag_artist, String custom_tag_copyright, double level_angle, double pitch_angle, boolean store_ypr) {
if( MyDebug.LOG )
Log.d(TAG, "modifyExif");
- setGPSDirectionExif(exif, store_geo_direction, geo_direction, level_angle, pitch_angle, store_ypr);
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N && store_ypr ){
+ setGPSDirectionExif(exif, store_geo_direction, geo_direction);
+ if( store_ypr ) {
float geo_angle = (float)Math.toDegrees(geo_direction);
if( geo_angle < 0.0f ) {
geo_angle += 360.0f;
@@ -3430,12 +3608,19 @@ public class ImageSaver extends Thread {
Log.d(TAG, "UserComment: " + exif.getAttribute(ExifInterface.TAG_USER_COMMENT));
}
setCustomExif(exif, custom_tag_artist, custom_tag_copyright);
- if( needGPSTimestampHack(is_jpeg, using_camera2, store_location) ) {
+ if( using_camera_extensions ) {
+ addDateTimeExif(exif, current_date);
+ if( store_location ) {
+ // also need to store geotagging, since Camera API doesn't support doing this for camera extensions
+ exif.setGpsInfo(location);
+ }
+ }
+ else if( needGPSTimestampHack(is_jpeg, using_camera2, store_location) ) {
fixGPSTimestamp(exif, current_date);
}
}
- private void setGPSDirectionExif(ExifInterface exif, boolean store_geo_direction, double geo_direction, double level_angle, double pitch_angle, boolean store_ypr) {
+ private void setGPSDirectionExif(ExifInterface exif, boolean store_geo_direction, double geo_direction) {
if( MyDebug.LOG )
Log.d(TAG, "setGPSDirectionExif");
if( store_geo_direction ) {
@@ -3449,17 +3634,17 @@ public class ImageSaver extends Thread {
String GPSImgDirection_string = Math.round(geo_angle*100) + "/100";
if( MyDebug.LOG )
Log.d(TAG, "GPSImgDirection_string: " + GPSImgDirection_string);
- exif.setAttribute(TAG_GPS_IMG_DIRECTION, GPSImgDirection_string);
- exif.setAttribute(TAG_GPS_IMG_DIRECTION_REF, "M");
+ exif.setAttribute(ExifInterface.TAG_GPS_IMG_DIRECTION, GPSImgDirection_string);
+ exif.setAttribute(ExifInterface.TAG_GPS_IMG_DIRECTION_REF, "M");
}
}
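The string built in setGPSDirectionExif() encodes the direction as an Exif rational with two decimal places of precision. As a runnable sketch, assuming the angle has already been converted to degrees and normalised to [0, 360) as elsewhere in the file:

```java
// Sketch of the Exif rational encoding used for TAG_GPS_IMG_DIRECTION:
// store hundredths of a degree as "numerator/100".
class GpsDirection {
    static String imgDirectionRational(float geoAngleDegrees) {
        return Math.round(geoAngleDegrees * 100) + "/100";
    }
}
```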
/** Whether custom exif tags need to be applied to the image file.
*/
private boolean hasCustomExif(String custom_tag_artist, String custom_tag_copyright) {
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N && custom_tag_artist != null && custom_tag_artist.length() > 0 )
+ if( custom_tag_artist != null && custom_tag_artist.length() > 0 )
return true;
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N && custom_tag_copyright != null && custom_tag_copyright.length() > 0 )
+ if( custom_tag_copyright != null && custom_tag_copyright.length() > 0 )
return true;
return false;
}
@@ -3469,12 +3654,12 @@ public class ImageSaver extends Thread {
private void setCustomExif(ExifInterface exif, String custom_tag_artist, String custom_tag_copyright) {
if( MyDebug.LOG )
Log.d(TAG, "setCustomExif");
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N && custom_tag_artist != null && custom_tag_artist.length() > 0 ) {
+ if( custom_tag_artist != null && custom_tag_artist.length() > 0 ) {
if( MyDebug.LOG )
Log.d(TAG, "apply TAG_ARTIST: " + custom_tag_artist);
exif.setAttribute(ExifInterface.TAG_ARTIST, custom_tag_artist);
}
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N && custom_tag_copyright != null && custom_tag_copyright.length() > 0 ) {
+ if( custom_tag_copyright != null && custom_tag_copyright.length() > 0 ) {
exif.setAttribute(ExifInterface.TAG_COPYRIGHT, custom_tag_copyright);
if( MyDebug.LOG )
Log.d(TAG, "apply TAG_COPYRIGHT: " + custom_tag_copyright);
@@ -3495,8 +3680,44 @@ public class ImageSaver extends Thread {
if( exif_datetime != null ) {
if( MyDebug.LOG )
Log.d(TAG, "write datetime tags: " + exif_datetime);
- exif.setAttribute(TAG_DATETIME_ORIGINAL, exif_datetime);
- exif.setAttribute(TAG_DATETIME_DIGITIZED, exif_datetime);
+ exif.setAttribute(ExifInterface.TAG_DATETIME_ORIGINAL, exif_datetime);
+ exif.setAttribute(ExifInterface.TAG_DATETIME_DIGITIZED, exif_datetime);
+ }
+ }
+
+ /** Adds exif tags for datetime from the supplied date, if not present. Needed for camera vendor
+ * extensions which (at least on Galaxy S10e) don't seem to have these tags set at all!
+ */
+ private void addDateTimeExif(ExifInterface exif, Date current_date) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "addDateTimeExif");
+ String exif_datetime = exif.getAttribute(ExifInterface.TAG_DATETIME);
+ if( MyDebug.LOG )
+ Log.d(TAG, "existing exif TAG_DATETIME: " + exif_datetime);
+ if( exif_datetime == null ) {
+ SimpleDateFormat date_fmt = new SimpleDateFormat("yyyy:MM:dd HH:mm:ss", Locale.US);
+ date_fmt.setTimeZone(TimeZone.getDefault()); // need local timezone for TAG_DATETIME
+ exif_datetime = date_fmt.format(current_date);
+ if( MyDebug.LOG )
+ Log.d(TAG, "new TAG_DATETIME: " + exif_datetime);
+
+ exif.setAttribute(ExifInterface.TAG_DATETIME, exif_datetime);
+ // set these tags too (even if already present, overwrite to be consistent)
+ exif.setAttribute(ExifInterface.TAG_DATETIME_ORIGINAL, exif_datetime);
+ exif.setAttribute(ExifInterface.TAG_DATETIME_DIGITIZED, exif_datetime);
+
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N ) {
+ // the SimpleDateFormat "XXX" pattern requires Android 7
+ // the offset needs to be in -/+HH:mm format, which is what "XXX" produces
+ date_fmt = new SimpleDateFormat("XXX", Locale.US);
+ date_fmt.setTimeZone(TimeZone.getDefault());
+ String timezone = date_fmt.format(current_date);
+ if( MyDebug.LOG )
+ Log.d(TAG, "timezone: " + timezone);
+ exif.setAttribute(ExifInterface.TAG_OFFSET_TIME, timezone);
+ exif.setAttribute(ExifInterface.TAG_OFFSET_TIME_ORIGINAL, timezone);
+ exif.setAttribute(ExifInterface.TAG_OFFSET_TIME_DIGITIZED, timezone);
+ }
}
}
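The two SimpleDateFormat patterns used by addDateTimeExif() are plain Java and can be exercised directly: "yyyy:MM:dd HH:mm:ss" is the Exif datetime layout, and "XXX" yields the ISO 8601 -/+HH:mm offset needed for the TAG_OFFSET_TIME family (note it produces "Z" rather than "+00:00" for UTC):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

// Runnable sketch of the date/offset formatting used by addDateTimeExif().
class ExifDateTimeFormat {
    // Exif datetime layout, e.g. "2024:01:31 12:00:00"
    static String exifDateTime(Date date, TimeZone tz) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy:MM:dd HH:mm:ss", Locale.US);
        fmt.setTimeZone(tz);
        return fmt.format(date);
    }

    // ISO 8601 offset, e.g. "+05:30" (or "Z" for UTC)
    static String exifOffset(Date date, TimeZone tz) {
        SimpleDateFormat fmt = new SimpleDateFormat("XXX", Locale.US);
        fmt.setTimeZone(tz);
        return fmt.format(date);
    }
}
```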
@@ -3517,7 +3738,7 @@ public class ImageSaver extends Thread {
// So now hopefully fixed properly...
// Note, this problem also occurs on OnePlus 3T and Gallery ICS, if we don't have this function called
SimpleDateFormat date_fmt = new SimpleDateFormat("yyyy:MM:dd", Locale.US);
- date_fmt.setTimeZone(TimeZone.getTimeZone("UTC")); // needs to be UTC time
+ date_fmt.setTimeZone(TimeZone.getTimeZone("UTC")); // needs to be UTC time for the GPS datetime tags
String datestamp = date_fmt.format(current_date);
SimpleDateFormat time_fmt = new SimpleDateFormat("HH:mm:ss", Locale.US);
@@ -3542,36 +3763,6 @@ public class ImageSaver extends Thread {
return false;
}
- /** Reads from picFile and writes the contents to saveUri.
- */
- private void copyFileToUri(Context context, Uri saveUri, File picFile) throws IOException {
- if( MyDebug.LOG ) {
- Log.d(TAG, "copyFileToUri");
- Log.d(TAG, "saveUri: " + saveUri);
- Log.d(TAG, "picFile: " + saveUri);
- }
- InputStream inputStream = null;
- OutputStream realOutputStream = null;
- try {
- inputStream = new FileInputStream(picFile);
- realOutputStream = context.getContentResolver().openOutputStream(saveUri);
- // Transfer bytes from in to out
- byte [] buffer = new byte[1024];
- int len;
- while( (len = inputStream.read(buffer)) > 0 ) {
- realOutputStream.write(buffer, 0, len);
- }
- }
- finally {
- if( inputStream != null ) {
- inputStream.close();
- }
- if( realOutputStream != null ) {
- realOutputStream.close();
- }
- }
- }
-
// for testing:
HDRProcessor getHDRProcessor() {
diff --git a/app/src/main/java/net/sourceforge/opencamera/LocationSupplier.java b/app/src/main/java/net/sourceforge/opencamera/LocationSupplier.java
index 3ecebb605f348b49e6ecc3c02370998b14ab3591..f998ef7afc2b29ba2a4e2838c369ca0841768048 100644
--- a/app/src/main/java/net/sourceforge/opencamera/LocationSupplier.java
+++ b/app/src/main/java/net/sourceforge/opencamera/LocationSupplier.java
@@ -11,6 +11,8 @@ import android.location.LocationProvider;
import android.os.Build;
import android.os.Bundle;
import android.preference.PreferenceManager;
+
+import androidx.annotation.NonNull;
import androidx.core.content.ContextCompat;
import android.util.Log;
@@ -74,12 +76,16 @@ public class LocationSupplier {
}
}
+ /** If adding extra calls to this, consider whether explicit user permission is required, and whether
+ * privacy policy or data privacy section needs updating.
+ * @return Returns null if location not available.
+ */
public Location getLocation() {
return getLocation(null);
}
/** If adding extra calls to this, consider whether explicit user permission is required, and whether
- * privacy policy needs updating.
+ * privacy policy or data privacy section needs updating.
* @param locationInfo Optional class to return additional information about the location.
* @return Returns null if location not available.
*/
@@ -115,7 +121,7 @@ public class LocationSupplier {
return location;
}
- public void onLocationChanged(Location location) {
+ public void onLocationChanged(@NonNull Location location) {
if( MyDebug.LOG )
Log.d(TAG, "onLocationChanged");
this.test_has_received_location = true;
@@ -123,8 +129,8 @@ public class LocationSupplier {
// also check for not being null just in case - had a nullpointerexception on Google Play!
if( location != null && ( location.getLatitude() != 0.0d || location.getLongitude() != 0.0d ) ) {
if( MyDebug.LOG ) {
- Log.d(TAG, "received location:");
- Log.d(TAG, "lat " + location.getLatitude() + " long " + location.getLongitude() + " accuracy " + location.getAccuracy());
+ Log.d(TAG, "received location");
+ // don't log the location, for privacy reasons!
}
this.location = location;
cacheLocation();
@@ -152,10 +158,10 @@ public class LocationSupplier {
}
}
- public void onProviderEnabled(String provider) {
+ public void onProviderEnabled(@NonNull String provider) {
}
- public void onProviderDisabled(String provider) {
+ public void onProviderDisabled(@NonNull String provider) {
if( MyDebug.LOG )
Log.d(TAG, "onProviderDisabled");
this.location = null;
@@ -166,6 +172,8 @@ public class LocationSupplier {
/* Best to only call this from MainActivity.initLocation().
* @return Returns false if location permission not available for either coarse or fine.
+ * Important to only return false if we actually want/need to ask the user for location
+ * permission!
*/
boolean setupLocationListener() {
if( MyDebug.LOG )
@@ -180,23 +188,34 @@ public class LocationSupplier {
// the user. However on Galaxy Nexus Android 4.3 and Nexus 7 (2013) Android 5.1.1, ACCESS_COARSE_LOCATION returns
// PERMISSION_DENIED! So we keep the checks to Android Marshmallow or later (where we need them), and avoid
// checking behaviour for earlier devices.
+ boolean has_coarse_location_permission;
+ boolean has_fine_location_permission;
if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
if( MyDebug.LOG )
Log.d(TAG, "check for location permissions");
- boolean has_coarse_location_permission = ContextCompat.checkSelfPermission(context, Manifest.permission.ACCESS_COARSE_LOCATION) == PackageManager.PERMISSION_GRANTED;
- boolean has_fine_location_permission = ContextCompat.checkSelfPermission(context, Manifest.permission.ACCESS_FINE_LOCATION) == PackageManager.PERMISSION_GRANTED;
+ has_coarse_location_permission = ContextCompat.checkSelfPermission(context, Manifest.permission.ACCESS_COARSE_LOCATION) == PackageManager.PERMISSION_GRANTED;
+ has_fine_location_permission = ContextCompat.checkSelfPermission(context, Manifest.permission.ACCESS_FINE_LOCATION) == PackageManager.PERMISSION_GRANTED;
if( MyDebug.LOG ) {
Log.d(TAG, "has_coarse_location_permission? " + has_coarse_location_permission);
Log.d(TAG, "has_fine_location_permission? " + has_fine_location_permission);
}
- // require both permissions to be present
- if( !has_coarse_location_permission || !has_fine_location_permission ) {
+ //has_coarse_location_permission = false; // test
+ //has_fine_location_permission = false; // test
+ // require at least one permission to be present
+ // will be important for Android 12+ where user can grant only coarse permission - we still
+ // want to support geotagging in such cases
+ if( !has_coarse_location_permission && !has_fine_location_permission ) {
if( MyDebug.LOG )
Log.d(TAG, "location permission not available");
// return false, which tells caller to request permission - we'll call this function again if permission is granted
return false;
}
}
+ else {
+ // permissions always available pre-Android 6
+ has_coarse_location_permission = true;
+ has_fine_location_permission = true;
+ }
locationListeners = new MyLocationListener[2];
locationListeners[0] = new MyLocationListener();
@@ -205,28 +224,31 @@ public class LocationSupplier {
// location listeners should be stored in order best to worst
// also see https://sourceforge.net/p/opencamera/tickets/1/ - need to check provider is available
// now also need to check for permissions - need to support devices that might have one but not both of fine and coarse permissions supplied
- if( locationManager.getAllProviders().contains(LocationManager.NETWORK_PROVIDER) ) {
+ if( has_coarse_location_permission && locationManager.getAllProviders().contains(LocationManager.NETWORK_PROVIDER) ) {
locationManager.requestLocationUpdates(LocationManager.NETWORK_PROVIDER, 1000, 0, locationListeners[1]);
if( MyDebug.LOG )
Log.d(TAG, "created coarse (network) location listener");
}
else {
if( MyDebug.LOG )
- Log.e(TAG, "don't have a NETWORK_PROVIDER");
+ Log.d(TAG, "don't have a NETWORK_PROVIDER");
}
- if( locationManager.getAllProviders().contains(LocationManager.GPS_PROVIDER) ) {
+ if( has_fine_location_permission && locationManager.getAllProviders().contains(LocationManager.GPS_PROVIDER) ) {
locationManager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 0, locationListeners[0]);
if( MyDebug.LOG )
Log.d(TAG, "created fine (gps) location listener");
}
else {
if( MyDebug.LOG )
- Log.e(TAG, "don't have a GPS_PROVIDER");
+ Log.d(TAG, "don't have a GPS_PROVIDER");
}
}
else if( !store_location ) {
freeLocationListeners();
}
+ // important to return true even if we didn't set up the location listeners - as
+ // returning false indicates to ask the user for location permission (which we don't want
+ // to do if the PreferenceKeys.LocationPreferenceKey preference isn't true)
return true;
}
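The permission logic above registers a listener for a provider only if the matching permission is held and the device offers that provider (important for Android 12+, where the user may grant only coarse location). A pure-logic sketch of that decision, outside the Android APIs (the class and method names are illustrative; `"gps"`/`"network"` are the string values of `LocationManager.GPS_PROVIDER`/`NETWORK_PROVIDER`):

```java
import java.util.ArrayList;
import java.util.List;

public class LocationListenerPlan {
    // A listener is registered for a provider only if we hold the matching
    // permission AND the device actually offers that provider. Fine permission
    // gates GPS; coarse permission gates the network provider. Best providers
    // come first, matching the "order best to worst" convention in the diff.
    static List<String> providersToListen(boolean hasCoarse, boolean hasFine,
                                          List<String> allProviders) {
        List<String> result = new ArrayList<>();
        if (hasFine && allProviders.contains("gps"))
            result.add("gps");
        if (hasCoarse && allProviders.contains("network"))
            result.add("network");
        return result;
    }
}
```

With only coarse permission granted, geotagging still works via the network provider, which is exactly the Android 12+ case the hunk comments describe.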
diff --git a/app/src/main/java/net/sourceforge/opencamera/MainActivity.java b/app/src/main/java/net/sourceforge/opencamera/MainActivity.java
index 1c114f63a1ad7c1cda29eeab2512f0b67156db67..cf2f527a7a874c8ea740b218fdd6dccf350151ea 100644
--- a/app/src/main/java/net/sourceforge/opencamera/MainActivity.java
+++ b/app/src/main/java/net/sourceforge/opencamera/MainActivity.java
@@ -1,5 +1,29 @@
package net.sourceforge.opencamera;
+import net.sourceforge.opencamera.cameracontroller.CameraController;
+import net.sourceforge.opencamera.cameracontroller.CameraControllerManager;
+import net.sourceforge.opencamera.cameracontroller.CameraControllerManager2;
+import net.sourceforge.opencamera.preview.Preview;
+import net.sourceforge.opencamera.preview.VideoProfile;
+import net.sourceforge.opencamera.remotecontrol.BluetoothRemoteControl;
+import net.sourceforge.opencamera.ui.FolderChooserDialog;
+import net.sourceforge.opencamera.ui.MainUI;
+import net.sourceforge.opencamera.ui.ManualSeekbars;
+
+import java.io.File;
+import java.io.IOException;
+import java.io.InputStream;
+import java.text.DecimalFormat;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Hashtable;
+import java.util.List;
+//import java.util.Locale;
+import java.util.Map;
+import java.util.concurrent.ExecutorService;
+import java.util.concurrent.Executors;
+import java.util.concurrent.Future;
+
import android.Manifest;
import android.animation.ArgbEvaluator;
import android.animation.ValueAnimator;
@@ -34,24 +58,53 @@ import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
+import android.hardware.display.DisplayManager;
+import android.media.MediaMetadataRetriever;
import android.net.Uri;
-import android.os.AsyncTask;
import android.os.Build;
import android.os.Bundle;
import android.os.Handler;
+import android.os.Looper;
import android.os.ParcelFileDescriptor;
import android.preference.PreferenceManager;
import android.provider.MediaStore;
+import android.animation.ArgbEvaluator;
+import android.animation.ValueAnimator;
+import android.annotation.SuppressLint;
+import android.annotation.TargetApi;
+import android.app.ActivityManager;
+import android.app.AlertDialog;
+import android.app.KeyguardManager;
+import android.content.ActivityNotFoundException;
+import android.content.ContentResolver;
+import android.content.Context;
+import android.content.DialogInterface;
+import android.content.Intent;
+import android.content.SharedPreferences;
+import android.content.pm.ActivityInfo;
+import android.content.pm.PackageManager;
+import android.content.res.Configuration;
import android.renderscript.RenderScript;
import android.speech.tts.TextToSpeech;
+import androidx.annotation.NonNull;
+import androidx.annotation.RequiresApi;
+import androidx.appcompat.app.AppCompatActivity;
+import androidx.core.content.ContextCompat;
+import androidx.exifinterface.media.ExifInterface;
+
+import android.text.InputFilter;
+import android.text.InputType;
+import android.text.Spanned;
import android.util.Log;
import android.view.Display;
import android.view.GestureDetector;
import android.view.GestureDetector.SimpleOnGestureListener;
import android.view.KeyEvent;
+import android.view.LayoutInflater;
import android.view.Menu;
import android.view.MotionEvent;
import android.view.OrientationEventListener;
+import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.view.ViewConfiguration;
@@ -96,8 +149,11 @@ import foundation.e.camera.R;
/**
* The main Activity for Open Camera.
*/
-public class MainActivity extends Activity {
+public class MainActivity extends AppCompatActivity {
private static final String TAG = "MainActivity";
private static int activity_count = 0;
@@ -117,7 +173,7 @@ public class MainActivity extends Activity {
private TextFormatter textFormatter;
private SoundPoolManager soundPoolManager;
private MagneticSensor magneticSensor;
- private SpeechControl speechControl;
+ //private SpeechControl speechControl;
private Preview preview;
private OrientationEventListener orientationEventListener;
@@ -135,6 +191,7 @@ public class MainActivity extends Activity {
private ValueAnimator gallery_save_anim;
private boolean last_continuous_fast_burst; // whether the last photo operation was a continuous_fast_burst
private boolean should_run_continuous_fast_burst = false;
+ private Future<?> update_gallery_future;
private TextToSpeech textToSpeech;
private boolean textToSpeechSuccess;
@@ -167,6 +224,7 @@ public class MainActivity extends Activity {
private final ToastBoxer white_balance_lock_toast = new ToastBoxer();
private final ToastBoxer exposure_lock_toast = new ToastBoxer();
private final ToastBoxer audio_control_toast = new ToastBoxer();
+ private final ToastBoxer store_location_toast = new ToastBoxer();
private boolean block_startup_toast = false; // used when returning from Settings/Popup - if we're displaying a toast anyway, don't want to display the info toast too
private String push_info_toast_text; // can be used to "push" extra text to the info text for showPhotoVideoToast()
@@ -189,7 +247,8 @@ public class MainActivity extends Activity {
public volatile boolean test_low_memory;
public volatile boolean test_have_angle;
public volatile float test_angle;
- public volatile String test_last_saved_image;
+ public volatile Uri test_last_saved_imageuri; // uri of last image; set if using scoped storage OR using SAF
+ public volatile String test_last_saved_image; // filename (including full path) of last image; set if not using scoped storage nor using SAF (i.e., writing using File API)
public static boolean test_force_supports_camera2; // okay to be static, as this is set for an entire test suite
public volatile String test_save_settings_file;
@@ -202,6 +261,30 @@ public class MainActivity extends Activity {
private float mWaterDensity = 1.0f;
private ImageButton switchVideoButton;
+ // whether to lock to landscape orientation, or allow switching between portrait and landscape orientations
+ //public static final boolean lock_to_landscape = true;
+ public static final boolean lock_to_landscape = false;
+
+ // handling for lock_to_landscape==false:
+
+ public enum SystemOrientation {
+ LANDSCAPE,
+ PORTRAIT,
+ REVERSE_LANDSCAPE
+ }
+
+ private MyDisplayListener displayListener;
+
+ private boolean has_cached_system_orientation;
+ private SystemOrientation cached_system_orientation;
+
+ private boolean hasOldSystemOrientation;
+ private SystemOrientation oldSystemOrientation;
+
+ private boolean has_cached_display_rotation;
+ private long cached_display_rotation_time_ms;
+ private int cached_display_rotation;
+
@Override
protected void onCreate(Bundle savedInstanceState) {
long debug_time = 0;
@@ -214,13 +297,6 @@ public class MainActivity extends Activity {
Log.d(TAG, "activity_count: " + activity_count);
super.onCreate(savedInstanceState);
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR2 ) {
- // don't show orientation animations
- WindowManager.LayoutParams layout = getWindow().getAttributes();
- layout.rotationAnimation = WindowManager.LayoutParams.ROTATION_ANIMATION_CROSSFADE;
- getWindow().setAttributes(layout);
- }
-
setContentView(R.layout.activity_main);
PreferenceManager.setDefaultValues(this, R.xml.preferences, false); // initialise any unset preferences to their default values
@@ -283,7 +359,7 @@ public class MainActivity extends Activity {
textFormatter = new TextFormatter(this);
soundPoolManager = new SoundPoolManager(this);
magneticSensor = new MagneticSensor(this);
- speechControl = new SpeechControl(this);
+ //speechControl = new SpeechControl(this);
// determine whether we support Camera2 API
initCamera2Support();
@@ -293,11 +369,12 @@ public class MainActivity extends Activity {
if( MyDebug.LOG )
Log.d(TAG, "onCreate: time after setting window flags: " + (System.currentTimeMillis() - debug_time));
- save_location_history = new SaveLocationHistory(this, "save_location_history", getStorageUtils().getSaveLocation());
+ save_location_history = new SaveLocationHistory(this, PreferenceKeys.SaveLocationHistoryBasePreferenceKey, getStorageUtils().getSaveLocation());
+ checkSaveLocations();
if( applicationInterface.getStorageUtils().isUsingSAF() ) {
if( MyDebug.LOG )
Log.d(TAG, "create new SaveLocationHistory for SAF");
- save_location_history_saf = new SaveLocationHistory(this, "save_location_history_saf", getStorageUtils().getSaveLocationSAF());
+ save_location_history_saf = new SaveLocationHistory(this, PreferenceKeys.SaveLocationHistorySAFBasePreferenceKey, getStorageUtils().getSaveLocationSAF());
}
if( MyDebug.LOG )
Log.d(TAG, "onCreate: time after updating folder history: " + (System.currentTimeMillis() - debug_time));
@@ -327,10 +404,30 @@ public class MainActivity extends Activity {
mainUI.closeExposureUI();
// set up the camera and its preview
- preview = new Preview(applicationInterface, ((ViewGroup) this.findViewById(R.id.preview)));
+ preview = new Preview(applicationInterface, (this.findViewById(R.id.preview)));
if( MyDebug.LOG )
Log.d(TAG, "onCreate: time after creating preview: " + (System.currentTimeMillis() - debug_time));
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR2 ) {
+ // don't show orientation animations
+ // must be done after creating Preview (so we know if Camera2 API or not)
+ WindowManager.LayoutParams layout = getWindow().getAttributes();
+ // If locked to landscape, ROTATION_ANIMATION_SEAMLESS/JUMPCUT has the problem that when going to
+ // Settings in portrait, we briefly see the UI change - this is because we set the flag
+ // to no longer lock to landscape, and that change happens too quickly.
+ // This isn't a problem when lock_to_landscape==false, and we want
+ // ROTATION_ANIMATION_SEAMLESS so that there is no/minimal pause from the preview when
+ // rotating the device. However if using old camera API, we get an ugly transition with
+ // ROTATION_ANIMATION_SEAMLESS (probably related to not using TextureView?)
+ if( lock_to_landscape || !preview.usingCamera2API() )
+ layout.rotationAnimation = WindowManager.LayoutParams.ROTATION_ANIMATION_CROSSFADE;
+ else if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.O )
+ layout.rotationAnimation = WindowManager.LayoutParams.ROTATION_ANIMATION_SEAMLESS;
+ else
+ layout.rotationAnimation = WindowManager.LayoutParams.ROTATION_ANIMATION_JUMPCUT;
+ getWindow().setAttributes(layout);
+ }
+
// Setup multi-camera buttons (must be done after creating preview so we know which Camera API is being used,
// and before initialising on-screen visibility).
// We only allow the separate icon for switching cameras if:
@@ -516,7 +613,7 @@ public class MainActivity extends Activity {
});
}
if( MyDebug.LOG )
- Log.d(TAG, "onCreate: time after setting immersive mode listener: " + (System.currentTimeMillis() - debug_time));
+ Log.d(TAG, "onCreate: time after setting system ui visibility listener: " + (System.currentTimeMillis() - debug_time));
// show "about" dialog for first time use; also set some per-device defaults
boolean has_done_first_time = sharedPreferences.contains(PreferenceKeys.FirstTimePreferenceKey);
@@ -553,7 +650,7 @@ public class MainActivity extends Activity {
// E.g., we have a "What's New" for 1.44 (64), but then push out a quick fix for 1.44.1 (65). We don't want to
// show the dialog again to people who already received 1.44 (64), but we still want to show the dialog to people
// upgrading from earlier versions.
- int whats_new_version = 75; // 1.48
+ int whats_new_version = 84; // 1.50
whats_new_version = Math.min(whats_new_version, version_code); // whats_new_version should always be <= version_code, but just in case!
if( MyDebug.LOG ) {
Log.d(TAG, "whats_new_version: " + whats_new_version);
@@ -683,12 +780,20 @@ public class MainActivity extends Activity {
* if sdk>=R & can't draw navigation bar, means the DecorFitsSystemWindows=true, then we can ignore navigationGap.
*/
public int getNavigationGap() {
- if(Build.VERSION.SDK_INT >= Build.VERSION_CODES.R && !can_draw_nav_bar) {
+ if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R && !can_draw_nav_bar) {
return 0;
}
return want_no_limits ? navigation_gap : 0;
}
+ /** Whether to use codepaths that are compatible with scoped storage.
+ */
+ public static boolean useScopedStorage() {
+ //return false;
+ //return true;
+ return Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q;
+ }
+
/** Whether this is a multi camera device, and the user preference is set to enable the multi-camera button.
*/
public boolean isMultiCamEnabled() {
@@ -758,35 +863,32 @@ public class MainActivity extends Activity {
void setDeviceDefaults() {
if( MyDebug.LOG )
Log.d(TAG, "setDeviceDefaults");
- SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
- boolean is_samsung = Build.MANUFACTURER.toLowerCase(Locale.US).contains("samsung");
- boolean is_oneplus = Build.MANUFACTURER.toLowerCase(Locale.US).contains("oneplus");
+ //SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
+ //boolean is_samsung = Build.MANUFACTURER.toLowerCase(Locale.US).contains("samsung");
+ //boolean is_oneplus = Build.MANUFACTURER.toLowerCase(Locale.US).contains("oneplus");
//boolean is_nexus = Build.MODEL.toLowerCase(Locale.US).contains("nexus");
//boolean is_nexus6 = Build.MODEL.toLowerCase(Locale.US).contains("nexus 6");
//boolean is_pixel_phone = Build.DEVICE != null && Build.DEVICE.equals("sailfish");
//boolean is_pixel_xl_phone = Build.DEVICE != null && Build.DEVICE.equals("marlin");
- if( MyDebug.LOG ) {
- Log.d(TAG, "is_samsung? " + is_samsung);
- Log.d(TAG, "is_oneplus? " + is_oneplus);
+ /*if( MyDebug.LOG ) {
+ //Log.d(TAG, "is_samsung? " + is_samsung);
+ //Log.d(TAG, "is_oneplus? " + is_oneplus);
//Log.d(TAG, "is_nexus? " + is_nexus);
//Log.d(TAG, "is_nexus6? " + is_nexus6);
//Log.d(TAG, "is_pixel_phone? " + is_pixel_phone);
//Log.d(TAG, "is_pixel_xl_phone? " + is_pixel_xl_phone);
- }
- if( is_samsung || is_oneplus ) {
- // workaround needed for Samsung Galaxy S7 at least (tested on Samsung RTL)
- // workaround needed for OnePlus 3 at least (see http://forum.xda-developers.com/oneplus-3/help/camera2-support-t3453103 )
- // update for v1.37: significant improvements have been made for standard flash and Camera2 API. But OnePlus 3T still has problem
- // that photos come out with a blue tinge if flash is on, and the scene is bright enough not to need it; Samsung devices also seem
- // to work okay, testing on S7 on RTL, but still keeping the fake flash mode in place for these devices, until we're sure of good
- // behaviour
- // update for testing on Galaxy S10e: still needs fake flash
+ }*/
+ /*if( is_samsung || is_oneplus ) {
+ // The problems we used to have on Samsung Galaxy devices are now fixed, by setting
+ // TEMPLATE_PREVIEW for the precaptureBuilder in CameraController2. This also fixes the
+ // problems with OnePlus 3T having blue tinge if flash is on, and the scene is bright
+ // enough not to need it
if( MyDebug.LOG )
Log.d(TAG, "set fake flash for camera2");
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.putBoolean(PreferenceKeys.Camera2FakeFlashPreferenceKey, true);
editor.apply();
- }
+ }*/
/*if( is_nexus6 ) {
// Nexus 6 captureBurst() started having problems with Android 7 upgrade - images appeared in wrong order (and with wrong order of shutter speeds in exif info), as well as problems with the camera failing with serious errors
// we set this even for Nexus 6 devices not on Android 7, as at some point they'll likely be upgraded to Android 7
@@ -957,6 +1059,154 @@ public class MainActivity extends Activity {
}
}
+ /** Handles users updating to a version with scoped storage (this could be Android 10 users upgrading
+ * to the version of Open Camera with scoped storage; or users who later upgrade to Android 10).
+ * With scoped storage, we no longer support saving outside of DCIM/ when not using SAF.
+ * This updates if necessary both the current save location, and the save folder history.
+ */
+ private void checkSaveLocations() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "checkSaveLocations");
+ if( useScopedStorage() ) {
+ SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
+ boolean any_changes = false;
+ String save_location = getStorageUtils().getSaveLocation();
+ CheckSaveLocationResult res = checkSaveLocation(save_location);
+ if( !res.res ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "save_location not valid with scoped storage: " + save_location);
+ String new_folder;
+ if( res.alt == null ) {
+ // no alternative, fall back to default
+ new_folder = "OpenCamera";
+ }
+ else {
+ // replace with the alternative
+ if( MyDebug.LOG )
+ Log.d(TAG, "alternative: " + res.alt);
+ new_folder = res.alt;
+ }
+ SharedPreferences.Editor editor = sharedPreferences.edit();
+ editor.putString(PreferenceKeys.SaveLocationPreferenceKey, new_folder);
+ editor.apply();
+ any_changes = true;
+ }
+
+ // now check history
+ // go backwards so we can remove easily
+ for(int i=save_location_history.size()-1;i>=0;i--) {
+ String this_location = save_location_history.get(i);
+ res = checkSaveLocation(this_location);
+ if( !res.res ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "save_location in history " + i + " not valid with scoped storage: " + this_location);
+ if( res.alt == null ) {
+ // no alternative, remove
+ save_location_history.remove(i);
+ }
+ else {
+ // replace with the alternative
+ if( MyDebug.LOG )
+ Log.d(TAG, "alternative: " + res.alt);
+ save_location_history.set(i, res.alt);
+ }
+ any_changes = true;
+ }
+ }
+
+ if( any_changes ) {
+ this.save_location_history.updateFolderHistory(this.getStorageUtils().getSaveLocation(), false);
+ }
+ }
+ }
+
+ /** Result from checkSaveLocation. Ideally we'd just use android.util.Pair, but that's not mocked
+ * for use in unit tests.
+ * See checkSaveLocation() for documentation.
+ */
+ public static class CheckSaveLocationResult {
+ final boolean res;
+ final String alt;
+
+ public CheckSaveLocationResult(boolean res, String alt) {
+ this.res = res;
+ this.alt = alt;
+ }
+
+ @Override
+ public boolean equals(Object o) {
+ if( !(o instanceof CheckSaveLocationResult) ) {
+ return false;
+ }
+ CheckSaveLocationResult that = (CheckSaveLocationResult)o;
+ // stop dumb inspection that suggests replacing warning with an error(!) (Objects class is not available on all API versions)
+ // and the other inspection suggests replacing with code that would cause a nullpointerexception
+ //noinspection EqualsReplaceableByObjectsCall,StringEquality
+ return that.res == this.res && ( (that.alt == this.alt) || (that.alt != null && that.alt.equals(this.alt) ) );
+ }
+
+ @Override
+ public int hashCode() {
+ return (res ? 1249 : 1259) ^ (alt == null ? 0 : alt.hashCode());
+ }
+
+ @NonNull
+ @Override
+ public String toString() {
+ return "CheckSaveLocationResult{" + res + " , " + alt + "}";
+ }
+ }
+
+ public static CheckSaveLocationResult checkSaveLocation(final String folder) {
+ return checkSaveLocation(folder, null);
+ }
+
+ /** Checks to see if the supplied folder (in the format as used by our preferences) is supported
+ * with scoped storage.
+ * @return A CheckSaveLocationResult whose res field indicates whether the save location is
+ * valid. If res is false and alt is non-null, alt stores an alternative form that
+ * is valid; if alt is null, there is no valid alternative.
+ * @param base_folder This should normally be null, but can be used to specify manually the
+ * folder instead of using StorageUtils.getBaseFolder() - needed for unit
+ * tests as Environment class (for Environment.getExternalStoragePublicDirectory())
+ * is not mocked.
+ */
+ public static CheckSaveLocationResult checkSaveLocation(final String folder, String base_folder) {
+ /*if( MyDebug.LOG )
+ Log.d(TAG, "DCIM path: " + Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM).getAbsolutePath());*/
+ if( StorageUtils.saveFolderIsFull(folder) ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "checkSaveLocation for full path: " + folder);
+ // But still check whether the full path is part of DCIM: when using the
+ // file dialog method with non-scoped storage, if the user specifies multiple subfolders
+ // e.g. DCIM/blah_a/blah_b, FolderChooserDialog.useFolder() doesn't spot that, and we
+ // still store it as the full path.
+
+ if( base_folder == null )
+ base_folder = StorageUtils.getBaseFolder().getAbsolutePath();
+ // strip '/' as last character - makes it easier to also spot cases where the folder is the
+ // DCIM folder, but doesn't have a '/' last character
+ if( base_folder.length() >= 1 && base_folder.charAt(base_folder.length()-1) == '/' )
+ base_folder = base_folder.substring(0, base_folder.length()-1);
+ if( MyDebug.LOG )
+ Log.d(TAG, " compare to base_folder: " + base_folder);
+ String alt_folder = null;
+ if( folder.startsWith(base_folder) ) {
+ alt_folder = folder.substring(base_folder.length());
+ // also need to strip the first '/' if it exists
+ if( alt_folder.length() >= 1 && alt_folder.charAt(0) == '/' )
+ alt_folder = alt_folder.substring(1);
+ }
+
+ return new CheckSaveLocationResult(false, alt_folder);
+ }
+ else {
+ // already in expected format (indicates a sub-folder of DCIM)
+ return new CheckSaveLocationResult(true, null);
+ }
+ }
+
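The core of `checkSaveLocation()` is the base-folder prefix stripping: a full path under DCIM is rewritten to a relative sub-folder, and anything else under a full path has no scoped-storage alternative. A standalone sketch of just that string handling (the class and method names are illustrative):

```java
public class SaveLocationCheck {
    // Returns the folder relative to baseFolder (the DCIM base), or null if
    // the folder is not under baseFolder and so has no valid alternative.
    static String relativeToBase(String folder, String baseFolder) {
        // strip a trailing '/' on the base so "/x/DCIM/" and "/x/DCIM"
        // compare the same
        if (baseFolder.endsWith("/"))
            baseFolder = baseFolder.substring(0, baseFolder.length() - 1);
        if (!folder.startsWith(baseFolder))
            return null; // no valid alternative
        String alt = folder.substring(baseFolder.length());
        // also strip the leading '/' if it exists
        if (alt.startsWith("/"))
            alt = alt.substring(1);
        return alt;
    }
}
```

So `/storage/emulated/0/DCIM/blah_a/blah_b` becomes the valid relative form `blah_a/blah_b`, while a path outside DCIM maps to null and falls back to the default folder.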
private void preloadIcons(int icons_id) {
long debug_time = 0;
if( MyDebug.LOG ) {
@@ -1165,20 +1415,51 @@ public class MainActivity extends Activity {
return super.onKeyUp(keyCode, event);
}
+ private void zoomByStep(int change) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "zoomByStep: " + change);
+ if( preview.supportsZoom() && change != 0 ) {
+ if( preview.getCameraController() != null ) {
+ // If the minimum zoom is < 1.0, the seekbar will have repeated entries for 1x zoom
+ // (so it's easier for the user to zoom to exactly 1.0x). But if using the -/+ buttons,
+ // volume keys etc to zoom, we want to skip over these repeated values.
+ int zoom_factor = preview.getCameraController().getZoom();
+ int new_zoom_factor = zoom_factor + change;
+ if( MyDebug.LOG )
+ Log.d(TAG, "new_zoom_factor: " + new_zoom_factor);
+ while( new_zoom_factor > 0 && new_zoom_factor < preview.getMaxZoom() && preview.getZoomRatio(new_zoom_factor) == preview.getZoomRatio() ) {
+ if( change > 0 )
+ change++;
+ else
+ change--;
+ new_zoom_factor = zoom_factor + change;
+ if( MyDebug.LOG )
+ Log.d(TAG, "skip over constant region: " + new_zoom_factor);
+ }
+ }
+
+ mainUI.changeSeekbar(R.id.zoom_seekbar, -change); // seekbar is opposite direction to zoom array
+ }
+ }
+
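`zoomByStep()` above widens the step while the zoom ratio table repeats a value (e.g. several seekbar entries pinned at 1.0x when minimum zoom is below 1.0). A self-contained sketch of that skip loop over a plain ratio array (names here are illustrative, not the Preview API):

```java
public class ZoomStep {
    // Steps index by change, but keeps widening the step while the ratio at
    // the new index equals the current ratio, so button/key zooming skips
    // over a run of repeated entries in one press. Bounds mirror the diff's
    // "> 0 && < maxZoom" checks.
    static int stepPastRepeats(double[] ratios, int index, int change) {
        int newIndex = index + change;
        while (newIndex > 0 && newIndex < ratios.length - 1
                && ratios[newIndex] == ratios[index]) {
            change += (change > 0) ? 1 : -1;
            newIndex = index + change;
        }
        return newIndex;
    }
}
```

For `{0.7, 1.0, 1.0, 1.0, 2.0}`, stepping up from index 1 lands on index 4 (2.0x) rather than another 1.0x entry.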
public void zoomIn() {
- mainUI.changeSeekbar(R.id.zoom_seekbar, 1);
+ zoomByStep(-1);
}
public void zoomOut() {
- mainUI.changeSeekbar(R.id.zoom_seekbar, -1);
+ zoomByStep(1);
}
public void changeExposure(int change) {
- mainUI.changeSeekbar(R.id.exposure_seekbar, change);
+ if( preview.supportsExposures() ) {
+ mainUI.changeSeekbar(R.id.exposure_seekbar, change);
+ }
}
public void changeISO(int change) {
- mainUI.changeSeekbar(R.id.iso_seekbar, change);
+ if( preview.supportsISORange() ) {
+ mainUI.changeSeekbar(R.id.iso_seekbar, change);
+ }
}
public void changeFocusDistance(int change, boolean is_target_distance) {
@@ -1196,19 +1477,6 @@ public class MainActivity extends Activity {
}
};
- /* To support https://play.google.com/store/apps/details?id=com.miband2.mibandselfie .
- * Allows using the Mi Band 2 as a Bluetooth remote for Open Camera to take photos or start/stop
- * videos.
- */
- private final BroadcastReceiver cameraReceiver = new BroadcastReceiver() {
- @Override
- public void onReceive(Context context, Intent intent) {
- if( MyDebug.LOG )
- Log.d(TAG, "cameraReceiver.onReceive");
- MainActivity.this.takePicture(false);
- }
- };
-
public float getWaterDensity() {
return this.mWaterDensity;
}
@@ -1223,35 +1491,44 @@ public class MainActivity extends Activity {
super.onResume();
this.app_is_paused = false; // must be set before initLocation() at least
+ // this is intentionally true, not false, as the uncovering happens in DrawPreview when we receive frames from the camera after it's opened
+ // (this should already have been set from the call in onPause(), but we set it here again just in case)
+ applicationInterface.getDrawPreview().setCoverPreview(true);
+
cancelImageSavingNotification();
// Set black window background; also needed if we hide the virtual buttons in immersive mode
// Note that we do it here rather than customising the theme's android:windowBackground, so this doesn't affect other views - in particular, the MyPreferenceFragment settings
getWindow().getDecorView().getRootView().setBackgroundColor(Color.BLACK);
+ registerDisplayListener();
+
mSensorManager.registerListener(accelerometerListener, mSensorAccelerometer, SensorManager.SENSOR_DELAY_NORMAL);
magneticSensor.registerMagneticListener(mSensorManager);
orientationEventListener.enable();
- registerReceiver(cameraReceiver, new IntentFilter("com.miband2.action.CAMERA"));
-
// if BLE remote control is enabled, then start the background BLE service
bluetoothRemoteControl.startRemoteControl();
- speechControl.initSpeechRecognizer();
+ //speechControl.initSpeechRecognizer();
initLocation();
initGyroSensors();
+ applicationInterface.getImageSaver().onResume();
soundPoolManager.initSound();
soundPoolManager.loadSound(R.raw.mybeep);
soundPoolManager.loadSound(R.raw.mybeep_hi);
+ resetCachedSystemOrientation(); // just in case?
mainUI.layoutUI();
updateGalleryIcon(); // update in case images deleted whilst idle
applicationInterface.reset(false); // should be called before opening the camera in preview.onResume()
- preview.onResume();
+ if( !camera_in_background ) {
+ // don't restart camera if we're showing a dialog or settings
+ preview.onResume();
+ }
{
// show a toast for the camera if it's not the first for front or back facing (otherwise on multi-front/back camera
@@ -1312,31 +1589,31 @@ public class MainActivity extends Activity {
this.app_is_paused = true;
mainUI.destroyPopup(); // important as user could change/reset settings from Android settings when pausing
+ unregisterDisplayListener();
mSensorManager.unregisterListener(accelerometerListener);
magneticSensor.unregisterMagneticListener(mSensorManager);
orientationEventListener.disable();
- try {
- unregisterReceiver(cameraReceiver);
- }
- catch(IllegalArgumentException e) {
- // this can happen if not registered - simplest to just catch the exception
- e.printStackTrace();
- }
bluetoothRemoteControl.stopRemoteControl();
freeAudioListener(false);
- speechControl.stopSpeechRecognizer();
+ //speechControl.stopSpeechRecognizer();
applicationInterface.getLocationSupplier().freeLocationListeners();
applicationInterface.stopPanorama(true); // in practice not needed as we should stop panorama when camera is closed, but good to do it explicitly here, before disabling the gyro sensors
applicationInterface.getGyroSensor().disableSensors();
+ applicationInterface.getImageSaver().onPause();
soundPoolManager.releaseSound();
applicationInterface.clearLastImages(); // this should happen when pausing the preview, but call explicitly just to be safe
applicationInterface.getDrawPreview().clearGhostImage();
preview.onPause();
+ applicationInterface.getDrawPreview().setCoverPreview(true); // must be after we've closed the preview (otherwise risk that further frames from preview will unset the cover_preview flag in DrawPreview)
if( applicationInterface.getImageSaver().getNImagesToSave() > 0) {
createImageSavingNotification();
}
+ if( update_gallery_future != null ) {
+ update_gallery_future.cancel(true);
+ }
+
// intentionally do this again, just in case something turned location on since - keep this right at the end:
applicationInterface.getLocationSupplier().freeLocationListeners();
@@ -1345,16 +1622,241 @@ public class MainActivity extends Activity {
}
}
+ @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR1)
+ private class MyDisplayListener implements DisplayManager.DisplayListener {
+ private int old_rotation;
+
+ private MyDisplayListener() {
+ int rotation = MainActivity.this.getWindowManager().getDefaultDisplay().getRotation();
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "MyDisplayListener");
+ Log.d(TAG, "rotation: " + rotation);
+ }
+ old_rotation = rotation;
+ }
+
+ @Override
+ public void onDisplayAdded(int displayId) {
+ }
+
+ @Override
+ public void onDisplayRemoved(int displayId) {
+ }
+
+ @Override
+ public void onDisplayChanged(int displayId) {
+ int rotation = MainActivity.this.getWindowManager().getDefaultDisplay().getRotation();
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "onDisplayChanged: " + displayId);
+ Log.d(TAG, "rotation: " + rotation);
+ Log.d(TAG, "old_rotation: " + old_rotation);
+ }
+ if( ( rotation == Surface.ROTATION_0 && old_rotation == Surface.ROTATION_180 ) ||
+ ( rotation == Surface.ROTATION_180 && old_rotation == Surface.ROTATION_0 ) ||
+ ( rotation == Surface.ROTATION_90 && old_rotation == Surface.ROTATION_270 ) ||
+ ( rotation == Surface.ROTATION_270 && old_rotation == Surface.ROTATION_90 )
+ ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "switched between landscape and reverse orientation");
+ onSystemOrientationChanged();
+ }
+
+ old_rotation = rotation;
+ }
+ }
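The four-branch condition in onDisplayChanged() above detects a 180-degree flip between landscape and reverse landscape. Since the Surface.ROTATION_* constants are the consecutive integers 0..3 (one step per 90 degrees), the same check can be sketched more compactly; this is a hypothetical helper for illustration only, not part of MainActivity, with the constant values copied from the Android documentation:

```java
public class RotationFlip {
    // Values match android.view.Surface: ROTATION_0=0, ROTATION_90=1,
    // ROTATION_180=2, ROTATION_270=3 (each step is 90 degrees).
    static final int ROTATION_0 = 0, ROTATION_90 = 1, ROTATION_180 = 2, ROTATION_270 = 3;

    /** True iff the two rotations differ by exactly 180 degrees
     *  (i.e., the constants differ by exactly 2 steps of 90 degrees). */
    static boolean isHalfTurn(int a, int b) {
        return Math.abs(a - b) == 2;
    }
}
```

The explicit four-way comparison in the diff is arguably clearer in context, since it names the Surface constants directly rather than relying on their numeric values.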
+
+ /** Creates and registers a display listener, needed to handle switches between landscape and
+ * reverse landscape (without going via portrait) when lock_to_landscape==false.
+ */
+ private void registerDisplayListener() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "registerDisplayListener");
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1 && !lock_to_landscape ) {
+ displayListener = new MyDisplayListener();
+ DisplayManager displayManager = (DisplayManager) this.getSystemService(Context.DISPLAY_SERVICE);
+ displayManager.registerDisplayListener(displayListener, null);
+ }
+ }
+
+ private void unregisterDisplayListener() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "unregisterDisplayListener");
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1 && displayListener != null ) {
+ DisplayManager displayManager = (DisplayManager) this.getSystemService(Context.DISPLAY_SERVICE);
+ displayManager.unregisterDisplayListener(displayListener);
+ displayListener = null;
+ }
+ }
+
@Override
- public void onConfigurationChanged(Configuration newConfig) {
+ public void onConfigurationChanged(@NonNull Configuration newConfig) {
if( MyDebug.LOG )
- Log.d(TAG, "onConfigurationChanged()");
+ Log.d(TAG, "onConfigurationChanged(): " + newConfig.orientation);
// configuration change can include screen orientation (landscape/portrait) when not locked (when settings is open)
// needed if app is paused/resumed when settings is open and device is in portrait mode
- preview.setCameraDisplayOrientation();
+ // update: need this all the time when lock_to_landscape==false
+ onSystemOrientationChanged();
super.onConfigurationChanged(newConfig);
}
+ private void onSystemOrientationChanged() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onSystemOrientationChanged");
+
+ // n.b., need to call this first, before preview.setCameraDisplayOrientation(), since
+ // preview.setCameraDisplayOrientation() will call getDisplayRotation() and we don't want
+ // to be using the outdated cached value now that the rotation has changed!
+ resetCachedSystemOrientation();
+
+ preview.setCameraDisplayOrientation();
+ if( !lock_to_landscape ) {
+ SystemOrientation newSystemOrientation = getSystemOrientation();
+ if( hasOldSystemOrientation && oldSystemOrientation == newSystemOrientation ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onSystemOrientationChanged: orientation hasn't changed");
+ }
+ else {
+ if( hasOldSystemOrientation ) {
+ // handle rotation animation
+ int start_rotation = getRotationFromSystemOrientation(oldSystemOrientation) - getRotationFromSystemOrientation(newSystemOrientation);
+ if( MyDebug.LOG )
+ Log.d(TAG, "start_rotation: " + start_rotation);
+ if( start_rotation < -180 )
+ start_rotation += 360;
+ else if( start_rotation > 180 )
+ start_rotation -= 360;
+ mainUI.layoutUIWithRotation(start_rotation);
+ }
+ else {
+ mainUI.layoutUI();
+ }
+ applicationInterface.getDrawPreview().updateSettings();
+
+ hasOldSystemOrientation = true;
+ oldSystemOrientation = newSystemOrientation;
+ }
+ }
+ }
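The start_rotation computation above takes the difference of two rotations (each a multiple of 90 degrees) and wraps it into [-180, 180] so that the UI animation always turns the shorter way. A minimal standalone sketch of that normalisation (hypothetical helper class, not part of the app):

```java
public class RotationDelta {
    /** Returns oldRotation - newRotation (both in degrees, multiples of 90),
     *  wrapped into the range [-180, 180] so an animation takes the shorter path. */
    static int normalisedDelta(int oldRotationDegrees, int newRotationDegrees) {
        int delta = oldRotationDegrees - newRotationDegrees;
        if( delta < -180 )
            delta += 360;  // e.g. 0 - 270 = -270 becomes 90
        else if( delta > 180 )
            delta -= 360;  // e.g. 270 - 0 = 270 becomes -90
        return delta;
    }
}
```

For example, rotating from portrait (270) to landscape (0) yields a raw delta of 270, which wraps to -90: a quarter turn in the opposite direction rather than a three-quarter turn.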
+
+ /** Returns the current system orientation.
+ * Note if lock_to_landscape is true, this always returns LANDSCAPE even if called when we're
+ * allowing configuration changes (e.g., in Settings or a dialog is showing). (This method,
+ * and hence calls to it, were added to support lock_to_landscape==false behaviour, and we
+ * want to avoid changing behaviour for lock_to_landscape==true behaviour.)
+ * Note that this also caches the orientation: firstly for performance (as this is called from
+ * DrawPreview), secondly to support REVERSE_LANDSCAPE, we don't want a sudden change if
+ * getDefaultDisplay().getRotation() changes after the configuration changes.
+ */
+ public SystemOrientation getSystemOrientation() {
+ if( lock_to_landscape ) {
+ return SystemOrientation.LANDSCAPE;
+ }
+ if( has_cached_system_orientation ) {
+ return cached_system_orientation;
+ }
+ SystemOrientation result;
+ int system_orientation = getResources().getConfiguration().orientation;
+ if( MyDebug.LOG )
+ Log.d(TAG, "system orientation: " + system_orientation);
+ switch( system_orientation ) {
+ case Configuration.ORIENTATION_LANDSCAPE:
+ result = SystemOrientation.LANDSCAPE;
+ // now try to distinguish between landscape and reverse landscape
+
+ // check whether the display matches the landscape configuration, in case this is inconsistent?
+ Point display_size = new Point();
+ Display display = getWindowManager().getDefaultDisplay();
+ display.getSize(display_size);
+ if( display_size.x > display_size.y ) {
+ int rotation = getWindowManager().getDefaultDisplay().getRotation();
+ if( MyDebug.LOG )
+ Log.d(TAG, "rotation: " + rotation);
+ switch( rotation ) {
+ case Surface.ROTATION_0:
+ case Surface.ROTATION_90:
+ // landscape
+ if( MyDebug.LOG )
+ Log.d(TAG, "landscape");
+ break;
+ case Surface.ROTATION_180:
+ case Surface.ROTATION_270:
+ // reverse landscape
+ if( MyDebug.LOG )
+ Log.d(TAG, "reverse landscape");
+ result = SystemOrientation.REVERSE_LANDSCAPE;
+ break;
+ default:
+ if( MyDebug.LOG )
+ Log.e(TAG, "unknown rotation: " + rotation);
+ break;
+ }
+ }
+ else {
+ if( MyDebug.LOG )
+ Log.e(TAG, "display size not landscape: " + display_size);
+ }
+ break;
+ case Configuration.ORIENTATION_PORTRAIT:
+ result = SystemOrientation.PORTRAIT;
+ break;
+ case Configuration.ORIENTATION_SQUARE:
+ case Configuration.ORIENTATION_UNDEFINED:
+ default:
+ if( MyDebug.LOG )
+ Log.e(TAG, "unknown system orientation: " + system_orientation);
+ result = SystemOrientation.LANDSCAPE;
+ break;
+ }
+ if( MyDebug.LOG )
+ Log.d(TAG, "system orientation is now: " + result);
+ this.has_cached_system_orientation = true;
+ this.cached_system_orientation = result;
+ return result;
+ }
+
+ /** Returns rotation in degrees (as a multiple of 90 degrees) corresponding to the supplied
+ * system orientation.
+ */
+ public static int getRotationFromSystemOrientation(SystemOrientation system_orientation) {
+ int rotation;
+ if( system_orientation == MainActivity.SystemOrientation.PORTRAIT )
+ rotation = 270;
+ else if( system_orientation == MainActivity.SystemOrientation.REVERSE_LANDSCAPE )
+ rotation = 180;
+ else
+ rotation = 0;
+ return rotation;
+ }
+
+ private void resetCachedSystemOrientation() {
+ this.has_cached_system_orientation = false;
+ this.has_cached_display_rotation = false;
+ }
+
+ /** A wrapper for getWindowManager().getDefaultDisplay().getRotation(), except if
+ * lock_to_landscape==false, this checks for the display being inconsistent with the system
+ * orientation, and if so, returns a cached value.
+ */
+ public int getDisplayRotation() {
+ if( lock_to_landscape ) {
+ return getWindowManager().getDefaultDisplay().getRotation();
+ }
+ // we cache to reduce effect of annoying problem where rotation changes shortly before the
+ // configuration actually changes (several frames), so on-screen elements would briefly show
+ // in wrong location when device rotates from/to portrait and landscape; also not a bad idea
+ // to cache for performance anyway, to avoid calling
+ // getWindowManager().getDefaultDisplay().getRotation() every frame
+ long time_ms = System.currentTimeMillis();
+ if( has_cached_display_rotation && time_ms < cached_display_rotation_time_ms + 1000 ) {
+ return cached_display_rotation;
+ }
+ has_cached_display_rotation = true;
+ int rotation = getWindowManager().getDefaultDisplay().getRotation();
+ cached_display_rotation = rotation;
+ cached_display_rotation_time_ms = time_ms;
+ return rotation;
+ }
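getDisplayRotation() above caches the display rotation for one second, both for performance (it can be queried every frame from DrawPreview) and to paper over the window where the reported rotation changes a few frames before the configuration does. The caching pattern can be isolated in a small sketch; the class name and the IntSupplier stand-in for the real display query are illustrative, not from the app:

```java
import java.util.function.IntSupplier;

public class CachedRotation {
    private final IntSupplier source; // stands in for display.getRotation()
    private final long cacheDurationMs;
    private boolean hasCached;
    private int cachedValue;
    private long cachedTimeMs;

    public CachedRotation(IntSupplier source, long cacheDurationMs) {
        this.source = source;
        this.cacheDurationMs = cacheDurationMs;
    }

    /** Returns the cached value if it is younger than cacheDurationMs,
     *  otherwise queries the source and refreshes the cache.
     *  Time is passed in to keep the sketch deterministic and testable. */
    public int get(long nowMs) {
        if( hasCached && nowMs < cachedTimeMs + cacheDurationMs )
            return cachedValue;
        hasCached = true;
        cachedValue = source.getAsInt();
        cachedTimeMs = nowMs;
        return cachedValue;
    }

    /** Invalidates the cache, analogous to resetCachedSystemOrientation(). */
    public void reset() {
        hasCached = false;
    }
}
```

Note that the real method also exposes an explicit invalidation path (resetCachedSystemOrientation() clears has_cached_display_rotation), which matters when the rotation is known to have just changed, as in onSystemOrientationChanged().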
+
public void waitUntilImageQueueEmpty() {
if( MyDebug.LOG )
Log.d(TAG, "waitUntilImageQueueEmpty");
@@ -1479,6 +1981,9 @@ public class MainActivity extends Activity {
applicationInterface.getDrawPreview().updateSettings(); // because we cache the geotagging setting
initLocation(); // required to enable or disable GPS, also requests permission if necessary
this.closePopup();
+
+ String message = getResources().getString(R.string.preference_location) + ": " + getResources().getString(value ? R.string.on : R.string.off);
+ preview.showToast(store_location_toast, message);
}
public void clickedTextStamp(View view) {
@@ -1489,9 +1994,12 @@ public class MainActivity extends Activity {
AlertDialog.Builder alertDialog = new AlertDialog.Builder(this);
alertDialog.setTitle(R.string.preference_textstamp);
- final EditText editText = new EditText(this);
+ final View dialog_view = LayoutInflater.from(this).inflate(R.layout.alertdialog_edittext, null);
+ final EditText editText = dialog_view.findViewById(R.id.edit_text);
+ // set hint instead of content description for EditText, see https://support.google.com/accessibility/android/answer/6378120
+ editText.setHint(getResources().getString(R.string.preference_textstamp));
editText.setText(applicationInterface.getTextStampPref());
- alertDialog.setView(editText);
+ alertDialog.setView(dialog_view);
alertDialog.setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialogInterface, int i) {
@@ -1631,7 +2139,7 @@ public class MainActivity extends Activity {
this.closePopup();
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
String audio_control = sharedPreferences.getString(PreferenceKeys.AudioControlPreferenceKey, "none");
- if( audio_control.equals("voice") && speechControl.hasSpeechRecognition() ) {
+ /*if( audio_control.equals("voice") && speechControl.hasSpeechRecognition() ) {
if( speechControl.isStarted() ) {
speechControl.stopListening();
}
@@ -1649,15 +2157,13 @@ public class MainActivity extends Activity {
}
}
if( has_audio_permission ) {
- String toast_string = this.getResources().getString(R.string.speech_recognizer_started) + "\n" +
- this.getResources().getString(R.string.speech_recognizer_extra_info);
- preview.showToast(audio_control_toast, toast_string);
+ speechControl.showToast(true);
speechControl.startSpeechRecognizerIntent();
speechControl.speechRecognizerStarted();
}
}
}
- else if( audio_control.equals("noise") ){
+ else*/ if( audio_control.equals("noise") ){
if( audio_listener != null ) {
freeAudioListener(false);
}
@@ -1953,6 +2459,12 @@ public class MainActivity extends Activity {
if( MyDebug.LOG )
Log.d(TAG, "onSharedPreferenceChanged: " + key);
+ if( key == null ) {
+ // on Android 11+, when targeting Android 11+, this method is called with key==null
+ // if preferences are cleared (see testSettings(), or when doing "Reset settings")
+ return;
+ }
+
any_change = true;
switch( key ) {
@@ -2132,6 +2644,9 @@ public class MainActivity extends Activity {
bundle.putInt("nCameras", preview.getCameraControllerManager().getNumberOfCameras());
bundle.putString("camera_api", this.preview.getCameraAPI());
bundle.putBoolean("using_android_l", this.preview.usingCamera2API());
+ if( this.preview.getCameraController() != null ) {
+ bundle.putInt("camera_orientation", this.preview.getCameraController().getCameraOrientation());
+ }
bundle.putString("photo_mode_string", getPhotoModeString(applicationInterface.getPhotoMode(), true));
bundle.putBoolean("supports_auto_stabilise", this.supports_auto_stabilise);
bundle.putBoolean("supports_flash", this.preview.supportsFlash());
@@ -2172,6 +2687,8 @@ public class MainActivity extends Activity {
bundle.putBoolean("supports_photo_video_recording", this.preview.supportsPhotoVideoRecording());
bundle.putFloat("camera_view_angle_x", preview.getViewAngleX(false));
bundle.putFloat("camera_view_angle_y", preview.getViewAngleY(false));
+ bundle.putFloat("min_zoom_factor", preview.getMinZoomRatio());
+ bundle.putFloat("max_zoom_factor", preview.getMaxZoomRatio());
putBundleExtra(bundle, "color_effects", this.preview.getSupportedColorEffects());
putBundleExtra(bundle, "scene_modes", this.preview.getSupportedSceneModes());
@@ -2358,7 +2875,7 @@ public class MainActivity extends Activity {
preferencesListener.startListening();
showPreview(false);
- setWindowFlagsForSettings();
+ setWindowFlagsForSettings(); // important to do after passing camera info into bundle, since this will close the camera
MyPreferenceFragment fragment = new MyPreferenceFragment();
fragment.setArguments(bundle);
// use commitAllowingStateLoss() instead of commit(), due to "java.lang.IllegalStateException: Can not perform this action after onSaveInstanceState" crash seen on Google Play
@@ -2366,23 +2883,25 @@ public class MainActivity extends Activity {
getFragmentManager().beginTransaction().add(android.R.id.content, fragment, "PREFERENCE_FRAGMENT").addToBackStack(null).commitAllowingStateLoss();
}
- public void updateForSettings() {
- updateForSettings(null, false);
+ public void updateForSettings(boolean update_camera) {
+ updateForSettings(update_camera, null, false);
}
- public void updateForSettings(String toast_message) {
- updateForSettings(toast_message, false);
+ public void updateForSettings(boolean update_camera, String toast_message) {
+ updateForSettings(update_camera, toast_message, false);
}
/** Must be called when any changes are made to the settings (as stored in SharedPreferences), so we can update the
* camera, and make any other necessary changes.
+ * @param update_camera Whether the camera needs to be updated. Can be set to false if we know changes
+ * haven't been made to the camera settings, or we already reopened it.
* @param toast_message If non-null, display this toast instead of the usual camera "startup" toast
* that's shown in showPhotoVideoToast(). If non-null but an empty string, then
* this means no toast is shown at all.
* @param keep_popup If false, the popup will be closed and destroyed. Set to true if you're sure
* that the changed setting isn't one that requires the PopupView to be recreated
*/
- public void updateForSettings(String toast_message, boolean keep_popup) {
+ public void updateForSettings(boolean update_camera, String toast_message, boolean keep_popup) {
if( MyDebug.LOG ) {
Log.d(TAG, "updateForSettings()");
if( toast_message != null ) {
@@ -2422,13 +2941,17 @@ public class MainActivity extends Activity {
// doesn't happen if we allow using Camera2 API on Nexus 7, but reopen for consistency (and changing scene modes via
// popup menu no longer should be calling updateForSettings() for Camera2, anyway)
boolean need_reopen = false;
- if( preview.getCameraController() != null ) {
+ if( update_camera && preview.getCameraController() != null ) {
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
String scene_mode = preview.getCameraController().getSceneMode();
if( MyDebug.LOG )
Log.d(TAG, "scene mode was: " + scene_mode);
String key = PreferenceKeys.SceneModePreferenceKey;
String value = sharedPreferences.getString(key, CameraController.SCENE_MODE_DEFAULT);
+ // n.b., on Android 4.3 emulator, scene mode is returned as null (this may be because it doesn't support
+ // scene modes at all) - treat this the same as auto
+ if( scene_mode == null )
+ scene_mode = CameraController.SCENE_MODE_DEFAULT;
if( !value.equals(scene_mode) ) {
if( MyDebug.LOG )
Log.d(TAG, "scene mode changed to: " + value);
@@ -2461,8 +2984,29 @@ public class MainActivity extends Activity {
}
}
}
+
+ if( !need_reopen ) {
+ boolean old_is_extension = preview.getCameraController().isCameraExtension();
+ boolean new_is_extension = applicationInterface.isCameraExtensionPref();
+ if( old_is_extension || new_is_extension ) {
+ // At least on Galaxy S10e, we have problems stopping and starting a camera extension session,
+ // e.g., when changing resolutions whilst in an extension mode (XHDR or bokeh) or switching
+ // from XHDR to other modes (including non-extension modes like STD). Problems such as preview
+ // no longer receiving frames, or the call to createExtensionSession() (or createCaptureSession)
+ // hanging. So therefore we should reopen the camera if at least
+ // old_is_extension==true.
+ // This isn't required if old_is_extension==false but new_is_extension==true,
+ // but we still do so since reopening the camera occurs on a background thread
+ // (opening an extension session seems to take longer, so better not to block
+ // the UI thread).
+ if( MyDebug.LOG )
+ Log.d(TAG, "need to reopen camera for changes to extension session");
+ need_reopen = true;
+ }
+ }
}
if( MyDebug.LOG ) {
+ Log.d(TAG, "need_reopen: " + need_reopen);
Log.d(TAG, "updateForSettings: time after check need_reopen: " + (System.currentTimeMillis() - debug_time));
}
@@ -2476,12 +3020,16 @@ public class MainActivity extends Activity {
checkDisableGUIIcons();
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
- if( sharedPreferences.getString(PreferenceKeys.AudioControlPreferenceKey, "none").equals("none") ) {
+ String audio_control = sharedPreferences.getString(PreferenceKeys.AudioControlPreferenceKey, "none");
+ // better to only display the audio control icon if it matches specific known supported types
+ // (important now that "voice" is no longer supported)
+ //if( !audio_control.equals("voice") && !audio_control.equals("noise") ) {
+ if( !audio_control.equals("noise") ) {
View speechRecognizerButton = findViewById(R.id.audio_control);
speechRecognizerButton.setVisibility(View.GONE);
}
- speechControl.initSpeechRecognizer(); // in case we've enabled or disabled speech recognizer
+ //speechControl.initSpeechRecognizer(); // in case we've enabled or disabled speech recognizer
// we no longer call initLocation() here (for having enabled or disabled geotagging), as that's
// done in setWindowFlagsForCamera() - important not to call it here as well, otherwise if
@@ -2495,7 +3043,10 @@ public class MainActivity extends Activity {
}
if( toast_message != null )
block_startup_toast = true;
- if( need_reopen || preview.getCameraController() == null ) { // if camera couldn't be opened before, might as well try again
+ if( !update_camera ) {
+ // don't try to update camera
+ }
+ else if( need_reopen || preview.getCameraController() == null ) { // if camera couldn't be opened before, might as well try again
preview.reopenCamera();
if( MyDebug.LOG ) {
Log.d(TAG, "updateForSettings: time after reopen: " + (System.currentTimeMillis() - debug_time));
@@ -2542,6 +3093,11 @@ public class MainActivity extends Activity {
if( MyDebug.LOG )
Log.d(TAG, "checkDisableGUIIcons");
boolean changed = false;
+ if( !supportsExposureButton() ) {
+ View button = findViewById(R.id.exposure);
+ changed = changed || (button.getVisibility() != View.GONE);
+ button.setVisibility(View.GONE);
+ }
if( !mainUI.showExposureLockIcon() ) {
View button = findViewById(R.id.exposure_lock);
changed = changed || (button.getVisibility() != View.GONE);
@@ -2594,6 +3150,8 @@ public class MainActivity extends Activity {
changed = changed || (button.getVisibility() != View.GONE);
button.setVisibility(View.GONE);
}
+ if( MyDebug.LOG )
+ Log.d(TAG, "checkDisableGUIIcons: " + changed);
return changed;
}
@@ -2628,7 +3186,8 @@ public class MainActivity extends Activity {
}
if( preferencesListener.anySignificantChange() ) {
- updateForSettings();
+ // don't need to update camera, as we now pause/resume camera when going to settings
+ updateForSettings(false);
}
else {
if( MyDebug.LOG )
@@ -2667,14 +3226,207 @@ public class MainActivity extends Activity {
super.onBackPressed();
}
- public void initImmersiveMode() {
- setImmersiveMode(true);
+ /** Whether to allow the application to show under the navigation bar, or not.
+ * Arguably we could enable this all the time, but in practice we only enable for cases when
+ * want_no_limits==true and navigation_gap!=0 (if want_no_limits==false, there's no need to
+ * show under the navigation bar; if navigation_gap==0, there is no navigation bar).
+ */
+ private void showUnderNavigation(boolean enable) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "showUnderNavigation: " + enable);
+
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN ) {
+ // We used to use window flag FLAG_LAYOUT_NO_LIMITS, but this didn't work properly on
+ // Android 11 (didn't take effect until orientation changed or application paused/resumed).
+ // Although system ui visibility flags are deprecated on Android 11, this still works better
+ // than the FLAG_LAYOUT_NO_LIMITS flag (which was not well documented anyway).
+ int flags = getWindow().getDecorView().getSystemUiVisibility();
+ if( enable ) {
+ getWindow().getDecorView().setSystemUiVisibility(flags | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION);
+ }
+ else {
+ getWindow().getDecorView().setSystemUiVisibility(flags & ~View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION);
+ }
+ }
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP ) {
+ getWindow().setNavigationBarColor(enable ? Color.TRANSPARENT : Color.BLACK);
+ }
}
- void setImmersiveMode(boolean on) {
+ /** The system is now such that we have entered or exited immersive mode. If visible is true,
+ * system UI is now visible such that we should exit immersive mode. If visible is false, the
+ * system has entered immersive mode.
+ */
+ private void immersiveModeChanged(boolean visible) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "immersiveModeChanged: " + visible);
+ if( !usingKitKatImmersiveMode() )
+ return;
+
+ SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(MainActivity.this);
+ String immersive_mode = sharedPreferences.getString(PreferenceKeys.ImmersiveModePreferenceKey, "immersive_mode_low_profile");
+ boolean hide_ui = immersive_mode.equals("immersive_mode_gui") || immersive_mode.equals("immersive_mode_everything");
+
+ if( visible ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "system bars now visible");
+ // change UI due to having exited immersive mode
+ if( hide_ui )
+ mainUI.setImmersiveMode(false);
+ setImmersiveTimer();
+ }
+ else {
+ if( MyDebug.LOG )
+ Log.d(TAG, "system bars now NOT visible");
+ // change UI due to having entered immersive mode
+ if( hide_ui )
+ mainUI.setImmersiveMode(true);
+ }
+ }
+
+ /** Set up listener to handle listening for system ui changes (for immersive mode), and setting
+ * a WindowInsetsListener to find the navigation_gap.
+ */
+ private void setupSystemUiVisibilityListener() {
+ View decorView = getWindow().getDecorView();
+
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP ) {
+ // set a window insets listener to find the navigation_gap
+ if( MyDebug.LOG )
+ Log.d(TAG, "set a window insets listener");
+ this.set_window_insets_listener = true;
+ decorView.getRootView().setOnApplyWindowInsetsListener(new View.OnApplyWindowInsetsListener() {
+ @Override
+ public WindowInsets onApplyWindowInsets(View v, WindowInsets insets) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "inset right: " + insets.getSystemWindowInsetRight());
+ Log.d(TAG, "inset bottom: " + insets.getSystemWindowInsetBottom());
+ }
+ if( navigation_gap == 0 ) {
+ SystemOrientation system_orientation = getSystemOrientation();
+ boolean system_orientation_portrait = system_orientation == SystemOrientation.PORTRAIT;
+ navigation_gap = system_orientation_portrait ? insets.getSystemWindowInsetBottom() : insets.getSystemWindowInsetRight();
+ if( MyDebug.LOG )
+ Log.d(TAG, "navigation_gap is " + navigation_gap);
+ // Sometimes when this callback is called, the navigation_gap may still be 0 even if
+ // the device doesn't have physical navigation buttons - we need to wait
+ // until we have found a non-zero value before switching to no limits.
+ // On devices with physical navigation bar, navigation_gap should remain 0
+ // (and there's no point setting FLAG_LAYOUT_NO_LIMITS)
+ if( want_no_limits && navigation_gap != 0 ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "set FLAG_LAYOUT_NO_LIMITS");
+ showUnderNavigation(true);
+ }
+ }
+
+ // see comments in MainUI.layoutUI() for why we don't use this
+ /*if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.S && getSystemOrientation() == SystemOrientation.LANDSCAPE ) {
+ Rect privacy_indicator_rect = insets.getPrivacyIndicatorBounds();
+ if( privacy_indicator_rect != null ) {
+ Rect window_bounds = getWindowManager().getCurrentWindowMetrics().getBounds();
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "privacy_indicator_rect: " + privacy_indicator_rect);
+ Log.d(TAG, "window_bounds: " + window_bounds);
+ }
+ privacy_indicator_gap = window_bounds.right - privacy_indicator_rect.left;
+ if( privacy_indicator_gap < 0 )
+ privacy_indicator_gap = 0; // just in case??
+ if( MyDebug.LOG )
+ Log.d(TAG, "privacy_indicator_gap: " + privacy_indicator_gap);
+ }
+ }
+ else {
+ privacy_indicator_gap = 0;
+ }*/
+ return getWindow().getDecorView().getRootView().onApplyWindowInsets(insets);
+ }
+ });
+ }
+
+ decorView.setOnSystemUiVisibilityChangeListener
+ (new View.OnSystemUiVisibilityChangeListener() {
+ @Override
+ public void onSystemUiVisibilityChange(int visibility) {
+ // Note that system bars will only be "visible" if none of the
+ // LOW_PROFILE, HIDE_NAVIGATION, or FULLSCREEN flags are set.
+
+ if( MyDebug.LOG )
+ Log.d(TAG, "onSystemUiVisibilityChange: " + visibility);
+
+ // Note that Android example code says to test against SYSTEM_UI_FLAG_FULLSCREEN,
+ // but this stopped working on Android 11, as when calling setSystemUiVisibility(0)
+ // to exit immersive mode, when we arrive here the flag SYSTEM_UI_FLAG_FULLSCREEN
+ // is still set. Fixed by checking for SYSTEM_UI_FLAG_HIDE_NAVIGATION instead -
+ // which makes some sense since we run in fullscreen mode all the time anyway.
+ //if( (visibility & View.SYSTEM_UI_FLAG_FULLSCREEN) == 0 ) {
+ if( (visibility & View.SYSTEM_UI_FLAG_HIDE_NAVIGATION) == 0 ) {
+ immersiveModeChanged(true);
+ }
+ else {
+ immersiveModeChanged(false);
+ }
+ }
+ });
+ }
+
+ public boolean usingKitKatImmersiveMode() {
+ // whether we are using a Kit Kat style immersive mode (either hiding navigation bar, GUI, or everything)
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT ) {
+ SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
+ String immersive_mode = sharedPreferences.getString(PreferenceKeys.ImmersiveModePreferenceKey, "immersive_mode_low_profile");
+ if( immersive_mode.equals("immersive_mode_navigation") || immersive_mode.equals("immersive_mode_gui") || immersive_mode.equals("immersive_mode_everything") )
+ return true;
+ }
+ return false;
+ }
+
+ public boolean usingKitKatImmersiveModeEverything() {
+ // whether we are using a Kit Kat style immersive mode for everything
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT ) {
+ SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
+ String immersive_mode = sharedPreferences.getString(PreferenceKeys.ImmersiveModePreferenceKey, "immersive_mode_low_profile");
+ if( immersive_mode.equals("immersive_mode_everything") )
+ return true;
+ }
+ return false;
+ }
+
+
+ private Handler immersive_timer_handler = null;
+ private Runnable immersive_timer_runnable = null;
+
+ private void setImmersiveTimer() {
+ if( immersive_timer_handler != null && immersive_timer_runnable != null ) {
+ immersive_timer_handler.removeCallbacks(immersive_timer_runnable);
+ }
+ immersive_timer_handler = new Handler();
+ immersive_timer_handler.postDelayed(immersive_timer_runnable = new Runnable(){
+ @Override
+ public void run(){
+ if( MyDebug.LOG )
+ Log.d(TAG, "setImmersiveTimer: run");
+ if( !camera_in_background && !popupIsOpen() && usingKitKatImmersiveMode() )
+ setImmersiveMode(true);
+ }
+ }, 5000);
+ }
+
+ public void initImmersiveMode() {
+ setImmersiveMode(true);
+ }
+
+ void setImmersiveMode(boolean on) {
if( MyDebug.LOG )
Log.d(TAG, "setImmersiveMode: " + on);
// n.b., preview.setImmersiveMode() is called from onSystemUiVisibilityChange()
+ int saved_flags = 0;
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN ) {
+ // save whether we set SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
+ saved_flags = getWindow().getDecorView().getSystemUiVisibility() & View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION;
+ }
+ if( MyDebug.LOG )
+ Log.d(TAG, "saved_flags?: " + saved_flags);
if (on) {
getWindow().getDecorView().setSystemUiVisibility(View.SYSTEM_UI_FLAG_LOW_PROFILE);
} else {
@@ -2761,12 +3513,19 @@ public class MainActivity extends Activity {
}*/
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
- // force to landscape mode
- setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
- //setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_REVERSE_LANDSCAPE); // testing for devices with unusual sensor orientation (e.g., Nexus 5X)
+ if( lock_to_landscape ) {
+ // force to landscape mode
+ setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
+ //setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_REVERSE_LANDSCAPE); // testing for devices with unusual sensor orientation (e.g., Nexus 5X)
+ }
+ else {
+ // allow orientation to change for camera, even if user has locked orientation
+ setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_SENSOR);
+ }
if( preview != null ) {
- // also need to call setCameraDisplayOrientation, as this handles if the user switched from portrait to reverse landscape whilst in settings/etc
+ // also need to call preview.setCameraDisplayOrientation, as this handles if the user switched from portrait to reverse landscape whilst in settings/etc
// as switching from reverse landscape back to landscape isn't detected in onConfigurationChanged
+ // update: now probably irrelevant now that we close/reopen the camera, but keep it here anyway
preview.setCameraDisplayOrientation();
}
if( preview != null && mainUI != null ) {
@@ -2793,7 +3552,7 @@ public class MainActivity extends Activity {
Log.d(TAG, "don't keep screen on");
this.getWindow().clearFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
}
- if( sharedPreferences.getBoolean(PreferenceKeys.ShowWhenLockedPreferenceKey, true) ) {
+ if( sharedPreferences.getBoolean(PreferenceKeys.ShowWhenLockedPreferenceKey, false) ) {
if( MyDebug.LOG )
Log.d(TAG, "do show when locked");
// keep Open Camera on top of screen-lock (will still need to unlock when going to gallery or settings)
@@ -2808,7 +3567,7 @@ public class MainActivity extends Activity {
if( want_no_limits && navigation_gap != 0 ) {
if( MyDebug.LOG )
Log.d(TAG, "set FLAG_LAYOUT_NO_LIMITS");
- getWindow().addFlags(WindowManager.LayoutParams.FLAG_LAYOUT_NO_LIMITS);
+ showUnderNavigation(true);
}
setBrightnessForCamera(false);
@@ -2824,6 +3583,11 @@ public class MainActivity extends Activity {
// app is paused. It can happen here because setWindowFlagsForCamera() is called from
// onCreate()
initLocation();
+
+ // Similarly only want to reopen the camera if no longer paused
+ if( preview != null ) {
+ preview.onResume();
+ }
}
}
@@ -2850,7 +3614,7 @@ public class MainActivity extends Activity {
if( want_no_limits && navigation_gap != 0 ) {
if( MyDebug.LOG )
Log.d(TAG, "clear FLAG_LAYOUT_NO_LIMITS");
- getWindow().clearFlags(WindowManager.LayoutParams.FLAG_LAYOUT_NO_LIMITS);
+ showUnderNavigation(false);
}
if( set_lock_protect ) {
// settings should still be protected by screen lock
@@ -2869,6 +3633,9 @@ public class MainActivity extends Activity {
// we disable location listening when showing settings or a dialog etc - saves battery life, also better for privacy
applicationInterface.getLocationSupplier().freeLocationListeners();
+
+ // similarly we close the camera
+ preview.onPause(false);
}
private void showWhenLocked(boolean show) {
@@ -2922,6 +3689,145 @@ public class MainActivity extends Activity {
container.setVisibility(show ? View.GONE : View.VISIBLE);
}
+ /** Rotates the supplied bitmap according to the orientation tag stored in the exif data. If no
+ * rotation is required, the input bitmap is returned. If rotation is required, the input
+ * bitmap is recycled.
+ * @param uri Uri containing the JPEG with Exif information to use.
+ */
+ public Bitmap rotateForExif(Bitmap bitmap, Uri uri) throws IOException {
+ ExifInterface exif;
+ InputStream inputStream = null;
+ try {
+ inputStream = this.getContentResolver().openInputStream(uri);
+ exif = new ExifInterface(inputStream);
+ }
+ finally {
+ if( inputStream != null )
+ inputStream.close();
+ }
+
+ if( exif != null ) {
+ int exif_orientation_s = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_UNDEFINED);
+ boolean needs_tf = false;
+ int exif_orientation = 0;
+ // see http://jpegclub.org/exif_orientation.html
+ // and http://stackoverflow.com/questions/20478765/how-to-get-the-correct-orientation-of-the-image-selected-from-the-default-image
+ if( exif_orientation_s == ExifInterface.ORIENTATION_UNDEFINED || exif_orientation_s == ExifInterface.ORIENTATION_NORMAL ) {
+ // leave unchanged
+ }
+ else if( exif_orientation_s == ExifInterface.ORIENTATION_ROTATE_180 ) {
+ needs_tf = true;
+ exif_orientation = 180;
+ }
+ else if( exif_orientation_s == ExifInterface.ORIENTATION_ROTATE_90 ) {
+ needs_tf = true;
+ exif_orientation = 90;
+ }
+ else if( exif_orientation_s == ExifInterface.ORIENTATION_ROTATE_270 ) {
+ needs_tf = true;
+ exif_orientation = 270;
+ }
+ else {
+ // just leave unchanged for now
+ if( MyDebug.LOG )
+ Log.e(TAG, " unsupported exif orientation: " + exif_orientation_s);
+ }
+ if( MyDebug.LOG )
+ Log.d(TAG, " exif orientation: " + exif_orientation);
+
+ if( needs_tf ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, " need to rotate bitmap due to exif orientation tag");
+ Matrix m = new Matrix();
+ m.setRotate(exif_orientation, bitmap.getWidth() * 0.5f, bitmap.getHeight() * 0.5f);
+ Bitmap rotated_bitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), m, true);
+ if( rotated_bitmap != bitmap ) {
+ bitmap.recycle();
+ bitmap = rotated_bitmap;
+ }
+ }
+ }
+ return bitmap;
+ }
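The orientation handling in rotateForExif() boils down to a small mapping from EXIF orientation tag values to rotation angles. As a standalone sketch (the class name `ExifRotation` is illustrative; the literal values 1, 3, 6 and 8 are the standard EXIF tag values that `ExifInterface.ORIENTATION_NORMAL`/`ROTATE_180`/`ROTATE_90`/`ROTATE_270` correspond to):

```java
public class ExifRotation {
    // EXIF orientation tag values (per the EXIF spec; ExifInterface uses the same numbers):
    // 1 = normal, 3 = rotated 180, 6 = rotated 90 CW, 8 = rotated 270 CW
    static int rotationForExifOrientation(int exif_orientation) {
        switch( exif_orientation ) {
            case 3: return 180;
            case 6: return 90;
            case 8: return 270;
            default: return 0; // undefined/normal/unsupported: leave unchanged
        }
    }

    public static void main(String[] args) {
        System.out.println(rotationForExifOrientation(3)); // 180
        System.out.println(rotationForExifOrientation(1)); // 0
    }
}
```

Anything outside this mapping (mirrored orientations, undefined values) is deliberately left unrotated, matching the "just leave unchanged for now" branch above.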
+
+ /** Loads a thumbnail from the supplied image uri (not videos). Note this decodes the bitmap
+ * directly from the uri, rather than reading a thumbnail from MediaStore. Therefore this works
+ * with SAF uris as well as MediaStore uris, and also allows control over the resolution of the
+ * thumbnail.
+ * If sample_factor is 1, this returns a bitmap scaled to match the display resolution. If
+ * sample_factor is greater than 1, it will be scaled down to a lower resolution.
+ * @param mediastore Whether the uri is a MediaStore uri (as opposed to e.g. a SAF uri).
+ */
+ private Bitmap loadThumbnailFromUri(Uri uri, int sample_factor, boolean mediastore) {
+ Bitmap thumbnail = null;
+ try {
+ //thumbnail = MediaStore.Images.Media.getBitmap(getContentResolver(), media.uri);
+ // only need to load a bitmap as large as the screen size
+ BitmapFactory.Options options = new BitmapFactory.Options();
+ InputStream is = getContentResolver().openInputStream(uri);
+ // get dimensions
+ options.inJustDecodeBounds = true;
+ BitmapFactory.decodeStream(is, null, options);
+ int bitmap_width = options.outWidth;
+ int bitmap_height = options.outHeight;
+ Point display_size = new Point();
+ Display display = getWindowManager().getDefaultDisplay();
+ display.getSize(display_size);
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "bitmap_width: " + bitmap_width);
+ Log.d(TAG, "bitmap_height: " + bitmap_height);
+ Log.d(TAG, "display width: " + display_size.x);
+ Log.d(TAG, "display height: " + display_size.y);
+ }
+ // align dimensions
+ if( display_size.x < display_size.y ) {
+ //noinspection SuspiciousNameCombination
+ display_size.set(display_size.y, display_size.x);
+ }
+ if( bitmap_width < bitmap_height ) {
+ int dummy = bitmap_width;
+ //noinspection SuspiciousNameCombination
+ bitmap_width = bitmap_height;
+ bitmap_height = dummy;
+ }
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "bitmap_width: " + bitmap_width);
+ Log.d(TAG, "bitmap_height: " + bitmap_height);
+ Log.d(TAG, "display width: " + display_size.x);
+ Log.d(TAG, "display height: " + display_size.y);
+ }
+ // only care about height, to save worrying about different aspect ratios
+ options.inSampleSize = 1;
+ while( bitmap_height / (2*options.inSampleSize) >= display_size.y ) {
+ options.inSampleSize *= 2;
+ }
+ options.inSampleSize *= sample_factor;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "inSampleSize: " + options.inSampleSize);
+ }
+ options.inJustDecodeBounds = false;
+ // need a new inputstream, see https://stackoverflow.com/questions/2503628/bitmapfactory-decodestream-returning-null-when-options-are-set
+ is.close();
+ is = getContentResolver().openInputStream(uri);
+ thumbnail = BitmapFactory.decodeStream(is, null, options);
+ if( thumbnail == null ) {
+ Log.e(TAG, "decodeStream returned null bitmap for ghost image last");
+ }
+ is.close();
+
+ if( !mediastore ) {
+ // When loading from a mediastore, the bitmap already seems to have the correct orientation.
+ // But when loading from a saf uri, we need to apply the rotation.
+ // E.g., test on Galaxy S10e with ghost image last image option, when using SAF, in portrait orientation, after pause/resume.
+ thumbnail = rotateForExif(thumbnail, uri);
+ }
+ }
+ catch(IOException e) {
+ Log.e(TAG, "failed to load bitmap for ghost image last");
+ e.printStackTrace();
+ }
+ return thumbnail;
+ }
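The inSampleSize calculation above (repeatedly doubling a power-of-two sample size until the decoded height would drop below the display height) can be isolated as a pure function. This is a sketch for illustration, not part of the patch; the class and method names are invented:

```java
public class SampleSize {
    /** Returns a power-of-two sample size (times sample_factor) such that
     *  bitmap_height/sample_size is still at least display_height, mirroring
     *  the loop in loadThumbnailFromUri(). */
    static int computeSampleSize(int bitmap_height, int display_height, int sample_factor) {
        int inSampleSize = 1;
        while( bitmap_height / (2*inSampleSize) >= display_height ) {
            inSampleSize *= 2;
        }
        return inSampleSize * sample_factor;
    }

    public static void main(String[] args) {
        // 4000px-tall image on a 1080px-tall display: 4000/2=2000 >= 1080, 4000/4=1000 < 1080
        System.out.println(computeSampleSize(4000, 1080, 1)); // 2
    }
}
```

With sample_factor 4 (as used for the gallery thumbnail from a SAF uri) the result is simply four times larger, giving a proportionally smaller decoded bitmap.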
+
/** Shows the default "blank" gallery icon, when we don't have a thumbnail available.
*/
private void updateGalleryIconToBlank() {
@@ -2942,17 +3848,24 @@ public class MainActivity extends Activity {
gallery_bitmap = null;
}
/** Shows a thumbnail for the gallery icon.
*/
void updateGalleryIcon(Bitmap thumbnail) {
if( MyDebug.LOG )
Log.d(TAG, "updateGalleryIcon: " + thumbnail);
- CircleImageView galleryButton = this.findViewById(R.id.gallery);
- galleryButton.setImageBitmap(thumbnail);
- galleryButton.setBorderWidth(6);
- gallery_bitmap = thumbnail;
-
+ // If we're currently running the background task to update the gallery (see updateGalleryIcon()), we should cancel that!
+ // Otherwise if user takes a photo whilst the background task is still running, the thumbnail from the latest photo will
+ // be overridden when the background task completes. This is more likely when using SAF on Android 10+ with scoped storage,
+ // due to SAF's poor performance for folders with a large number of files.
+ if( update_gallery_future != null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "cancel update_gallery_future");
+ update_gallery_future.cancel(true);
+ }
+ CircleImageView galleryButton = this.findViewById(R.id.gallery);
+ galleryButton.setImageBitmap(thumbnail);
+ galleryButton.setBorderWidth(6);
+ gallery_bitmap = thumbnail;
}
/** Updates the gallery icon by searching for the most recent photo.
@@ -2964,17 +3877,27 @@ public class MainActivity extends Activity {
Log.d(TAG, "updateGalleryIcon");
debug_time = System.currentTimeMillis();
}
+ if( update_gallery_future != null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "previous updateGalleryIcon task already running");
+ return;
+ }
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
String ghost_image_pref = sharedPreferences.getString(PreferenceKeys.GhostImagePreferenceKey, "preference_ghost_image_off");
final boolean ghost_image_last = ghost_image_pref.equals("preference_ghost_image_last");
- new AsyncTask() {
- private static final String TAG = "MainActivity/AsyncTask";
+
+ final Handler handler = new Handler(Looper.getMainLooper());
+
+ //new AsyncTask() {
+ Runnable runnable = new Runnable() {
+ private static final String TAG = "updateGalleryIcon";
+ private Uri uri;
+ private boolean is_raw;
private boolean is_video;
- /** The system calls this to perform work in a worker thread and
- * delivers it the parameters given to AsyncTask.execute() */
- protected Bitmap doInBackground(Void... params) {
+ @Override
+ //protected Bitmap doInBackground(Void... params) {
+ public void run() {
if( MyDebug.LOG )
Log.d(TAG, "doInBackground");
StorageUtils.Media media = applicationInterface.getStorageUtils().getLatestMedia();
@@ -2985,75 +3908,60 @@ public class MainActivity extends Activity {
Log.d(TAG, "is_locked?: " + is_locked);
if( media != null && getContentResolver() != null && !is_locked ) {
// check for getContentResolver() != null, as have had reported Google Play crashes
+
+ uri = media.getMediaStoreUri(MainActivity.this);
+ is_raw = media.filename != null && StorageUtils.filenameIsRaw(media.filename);
+ is_video = media.video;
+
if( ghost_image_last && !media.video ) {
if( MyDebug.LOG )
Log.d(TAG, "load full size bitmap for ghost image last photo");
- try {
- //thumbnail = MediaStore.Images.Media.getBitmap(getContentResolver(), media.uri);
- // only need to load a bitmap as large as the screen size
- BitmapFactory.Options options = new BitmapFactory.Options();
- InputStream is = getContentResolver().openInputStream(media.uri);
- // get dimensions
- options.inJustDecodeBounds = true;
- BitmapFactory.decodeStream(is, null, options);
- int bitmap_width = options.outWidth;
- int bitmap_height = options.outHeight;
- Point display_size = new Point();
- Display display = getWindowManager().getDefaultDisplay();
- display.getSize(display_size);
- if( MyDebug.LOG ) {
- Log.d(TAG, "bitmap_width: " + bitmap_width);
- Log.d(TAG, "bitmap_height: " + bitmap_height);
- Log.d(TAG, "display width: " + display_size.x);
- Log.d(TAG, "display height: " + display_size.y);
- }
- // align dimensions
- if( display_size.x < display_size.y ) {
- //noinspection SuspiciousNameCombination
- display_size.set(display_size.y, display_size.x);
- }
- if( bitmap_width < bitmap_height ) {
- int dummy = bitmap_width;
- //noinspection SuspiciousNameCombination
- bitmap_width = bitmap_height;
- bitmap_height = dummy;
- }
- if( MyDebug.LOG ) {
- Log.d(TAG, "bitmap_width: " + bitmap_width);
- Log.d(TAG, "bitmap_height: " + bitmap_height);
- Log.d(TAG, "display width: " + display_size.x);
- Log.d(TAG, "display height: " + display_size.y);
- }
- // only care about height, to save worrying about different aspect ratios
- options.inSampleSize = 1;
- while( bitmap_height / (2*options.inSampleSize) >= display_size.y ) {
- options.inSampleSize *= 2;
- }
- if( MyDebug.LOG ) {
- Log.d(TAG, "inSampleSize: " + options.inSampleSize);
- }
- options.inJustDecodeBounds = false;
- // need a new inputstream, see https://stackoverflow.com/questions/2503628/bitmapfactory-decodestream-returning-null-when-options-are-set
- is.close();
- is = getContentResolver().openInputStream(media.uri);
- thumbnail = BitmapFactory.decodeStream(is, null, options);
- if( thumbnail == null ) {
- Log.e(TAG, "decodeStream returned null bitmap for ghost image last");
- }
- is.close();
- }
- catch(IOException e) {
- Log.e(TAG, "failed to load bitmap for ghost image last");
- e.printStackTrace();
- }
+ thumbnail = loadThumbnailFromUri(media.uri, 1, media.mediastore);
}
if( thumbnail == null ) {
try {
- if( media.video ) {
+ if( !media.mediastore ) {
+ if( media.video ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "load thumbnail for video from SAF uri");
+ ParcelFileDescriptor pfd_saf = null; // keep a reference to this as long as retriever, to avoid risk of pfd_saf being garbage collected
+ MediaMetadataRetriever retriever = new MediaMetadataRetriever();
+ try {
+ pfd_saf = getContentResolver().openFileDescriptor(media.uri, "r");
+ retriever.setDataSource(pfd_saf.getFileDescriptor());
+ thumbnail = retriever.getFrameAtTime(-1);
+ }
+ catch(Exception e) {
+ Log.e(TAG, "failed to load video thumbnail");
+ e.printStackTrace();
+ }
+ finally {
+ try {
+ retriever.release();
+ }
+ catch(RuntimeException ex) {
+ // ignore
+ }
+ try {
+ if( pfd_saf != null ) {
+ pfd_saf.close();
+ }
+ }
+ catch(IOException e) {
+ e.printStackTrace();
+ }
+ }
+ }
+ else {
+ if( MyDebug.LOG )
+ Log.d(TAG, "load thumbnail for photo from SAF uri");
+ thumbnail = loadThumbnailFromUri(media.uri, 4, media.mediastore);
+ }
+ }
+ else if( media.video ) {
if( MyDebug.LOG )
Log.d(TAG, "load thumbnail for video");
thumbnail = MediaStore.Video.Thumbnails.getThumbnail(getContentResolver(), media.id, MediaStore.Video.Thumbnails.MINI_KIND, null);
- is_video = true;
}
else {
if( MyDebug.LOG )
@@ -3092,16 +4000,37 @@ public class MainActivity extends Activity {
}
}
}
- return thumbnail;
+ //return thumbnail;
+
+ final Bitmap thumbnail_f = thumbnail;
+ handler.post(new Runnable() {
+ @Override
+ public void run() {
+ onPostExecute(thumbnail_f);
+ }
+ });
}
- /** The system calls this to perform work in the UI thread and delivers
- * the result from doInBackground() */
- protected void onPostExecute(Bitmap thumbnail) {
+ /** Runs on UI thread, after background work is complete.
+ */
+ private void onPostExecute(Bitmap thumbnail) {
if( MyDebug.LOG )
Log.d(TAG, "onPostExecute");
+ if( update_gallery_future != null && update_gallery_future.isCancelled() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "was cancelled");
+ update_gallery_future = null;
+ return;
+ }
// since we're now setting the thumbnail to the latest media on disk, we need to make sure clicking the Gallery goes to this
applicationInterface.getStorageUtils().clearLastMediaScanned();
+ if( uri != null ) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "found media uri: " + uri);
+ Log.d(TAG, " is_raw?: " + is_raw);
+ }
+ applicationInterface.getStorageUtils().setLastMediaScanned(uri, is_raw);
+ }
if( thumbnail != null ) {
if( MyDebug.LOG )
Log.d(TAG, "set gallery button to thumbnail");
@@ -3113,8 +4042,15 @@ public class MainActivity extends Activity {
Log.d(TAG, "set gallery button to blank");
updateGalleryIconToBlank();
}
+
+ update_gallery_future = null;
}
- }.execute();
+ //}.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR);
+ };
+
+ ExecutorService executor = Executors.newSingleThreadExecutor();
+ //executor.execute(runnable);
+ update_gallery_future = executor.submit(runnable);
if( MyDebug.LOG )
Log.d(TAG, "updateGalleryIcon: total time to update gallery icon: " + (System.currentTimeMillis() - debug_time));
@@ -3214,23 +4150,33 @@ public class MainActivity extends Activity {
Log.d(TAG, "openGallery");
//Intent intent = new Intent(Intent.ACTION_VIEW, MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
Uri uri = applicationInterface.getStorageUtils().getLastMediaScanned();
- boolean is_raw = false; // note that getLastMediaScanned() will never return RAW images, as we only record JPEGs
+ boolean is_raw = uri != null && applicationInterface.getStorageUtils().getLastMediaScannedIsRaw();
+ if( MyDebug.LOG && uri != null ) {
+ Log.d(TAG, "found cached most recent uri: " + uri);
+ Log.d(TAG, " is_raw: " + is_raw);
+ }
if( uri == null ) {
if( MyDebug.LOG )
Log.d(TAG, "go to latest media");
StorageUtils.Media media = applicationInterface.getStorageUtils().getLatestMedia();
if( media != null ) {
- uri = media.uri;
- is_raw = media.filename != null && media.filename.toLowerCase(Locale.US).endsWith(".dng");
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "latest uri:" + media.uri);
+ Log.d(TAG, "filename: " + media.filename);
+ }
+ uri = media.getMediaStoreUri(this);
+ if( MyDebug.LOG )
+ Log.d(TAG, "media uri:" + uri);
+ is_raw = media.filename != null && StorageUtils.filenameIsRaw(media.filename);
+ if( MyDebug.LOG )
+ Log.d(TAG, "is_raw:" + is_raw);
}
}
- if( uri != null ) {
+ if( uri != null && !MainActivity.useScopedStorage() ) {
// check uri exists
- if( MyDebug.LOG ) {
- Log.d(TAG, "found most recent uri: " + uri);
- Log.d(TAG, "is_raw: " + is_raw);
- }
+ // note, with scoped storage this isn't reliable when using SAF - since we don't actually have permission to access mediastore URIs that
+ // were created via Storage Access Framework, even though Open Camera was the application that saved them(!)
try {
ContentResolver cr = getContentResolver();
ParcelFileDescriptor pfd = cr.openFileDescriptor(uri, "r");
@@ -3262,9 +4208,11 @@ public class MainActivity extends Activity {
final String REVIEW_ACTION = "com.android.camera.action.REVIEW";
boolean done = false;
if( !is_raw ) {
- // REVIEW_ACTION means we can view video files without autoplaying
- // however, Google Photos at least has problems with going to a RAW photo (in RAW only mode),
- // unless we first pause and resume Open Camera
+ // REVIEW_ACTION means we can view video files without autoplaying.
+ // However, Google Photos at least has problems with going to a RAW photo (in RAW only mode),
+ // unless we first pause and resume Open Camera.
+ // Update: on Galaxy S10e with Android 11 at least, no longer seem to have problems, but leave
+ // the check for is_raw just in case for older devices.
if( MyDebug.LOG )
Log.d(TAG, "try REVIEW_ACTION");
try {
@@ -3279,22 +4227,19 @@ public class MainActivity extends Activity {
if( !done ) {
if( MyDebug.LOG )
Log.d(TAG, "try ACTION_VIEW");
- Intent intent = new Intent(Intent.ACTION_VIEW, uri);
- // see http://stackoverflow.com/questions/11073832/no-activity-found-to-handle-intent - needed to fix crash if no gallery app installed
- //Intent intent = new Intent(Intent.ACTION_VIEW, Uri.parse("blah")); // test
- if( intent.resolveActivity(getPackageManager()) != null ) {
- try {
- this.startActivity(intent);
- }
- catch(SecurityException e2) {
- // have received this crash from Google Play - don't display a toast, simply do nothing
- Log.e(TAG, "SecurityException from ACTION_VIEW startActivity");
- e2.printStackTrace();
- }
+ try {
+ Intent intent = new Intent(Intent.ACTION_VIEW, uri);
+ this.startActivity(intent);
}
- else{
+ catch(ActivityNotFoundException e) {
+ e.printStackTrace();
preview.showToast(null, R.string.no_gallery_app);
}
+ catch(SecurityException e) {
+ // have received this crash from Google Play - don't display a toast, simply do nothing
+ Log.e(TAG, "SecurityException from ACTION_VIEW startActivity");
+ e.printStackTrace();
+ }
}
}
}
@@ -3379,13 +4324,16 @@ public class MainActivity extends Activity {
if (MyDebug.LOG) {
Log.d(TAG, "onActivityResult: " + requestCode);
}
+
+ super.onActivityResult(requestCode, resultCode, resultData);
+
switch( requestCode ) {
case CHOOSE_SAVE_FOLDER_SAF_CODE:
if( resultCode == RESULT_OK && resultData != null ) {
Uri treeUri = resultData.getData();
if( MyDebug.LOG )
Log.d(TAG, "returned treeUri: " + treeUri);
- // see https://developer.android.com/guide/topics/providers/document-provider.html#permissions :
+ // see https://developer.android.com/training/data-storage/shared/documents-files#persist-permissions :
final int takeFlags = resultData.getFlags() & (Intent.FLAG_GRANT_READ_URI_PERMISSION | Intent.FLAG_GRANT_WRITE_URI_PERMISSION);
try {
/*if( true )
@@ -3401,9 +4349,9 @@ public class MainActivity extends Activity {
Log.d(TAG, "update folder history for saf");
updateFolderHistorySAF(treeUri.toString());
- File file = applicationInterface.getStorageUtils().getImageFolder();
+ String file = applicationInterface.getStorageUtils().getImageFolderPath();
if( file != null ) {
- preview.showToast(null, getResources().getString(R.string.changed_save_location) + "\n" + file.getAbsolutePath());
+ preview.showToast(null, getResources().getString(R.string.changed_save_location) + "\n" + file);
}
}
catch(SecurityException e) {
@@ -3450,8 +4398,7 @@ public class MainActivity extends Activity {
Log.d(TAG, "returned single fileUri: " + fileUri);
// persist permission just in case?
final int takeFlags = resultData.getFlags()
- & (Intent.FLAG_GRANT_READ_URI_PERMISSION
- | Intent.FLAG_GRANT_WRITE_URI_PERMISSION);
+ & (Intent.FLAG_GRANT_READ_URI_PERMISSION);
try {
/*if( true )
throw new SecurityException(); // test*/
@@ -3506,8 +4453,7 @@ public class MainActivity extends Activity {
Log.d(TAG, "returned single fileUri: " + fileUri);
// persist permission just in case?
final int takeFlags = resultData.getFlags()
- & (Intent.FLAG_GRANT_READ_URI_PERMISSION
- | Intent.FLAG_GRANT_WRITE_URI_PERMISSION);
+ & (Intent.FLAG_GRANT_READ_URI_PERMISSION);
try {
/*if( true )
throw new SecurityException(); // test*/
@@ -3535,6 +4481,8 @@ public class MainActivity extends Activity {
}
}
+ /** Update the save folder (for non-SAF methods).
+ */
void updateSaveFolder(String new_save_location) {
if( MyDebug.LOG )
Log.d(TAG, "updateSaveFolder: " + new_save_location);
@@ -3550,7 +4498,8 @@ public class MainActivity extends Activity {
editor.apply();
this.save_location_history.updateFolderHistory(this.getStorageUtils().getSaveLocation(), true);
- this.preview.showToast(null, getResources().getString(R.string.changed_save_location) + "\n" + this.applicationInterface.getStorageUtils().getSaveLocation());
+ String save_folder_name = getHumanReadableSaveFolder(this.applicationInterface.getStorageUtils().getSaveLocation());
+ this.preview.showToast(null, getResources().getString(R.string.changed_save_location) + "\n" + save_folder_name);
}
}
}
@@ -3579,6 +4528,84 @@ public class MainActivity extends Activity {
}
}
+ /** Processes a user specified save folder. This should be used with the non-SAF scoped storage
+ * method, where the user types a folder directly.
+ */
+ public static String processUserSaveLocation(String folder) {
+ // filter repeated '/', e.g., replace // with /:
+ String strip = "//";
+ while( folder.length() >= 1 && folder.contains(strip) ) {
+ folder = folder.replaceAll(strip, "/");
+ }
+
+ if( folder.length() >= 1 && folder.charAt(0) == '/' ) {
+ // strip '/' as first character - as absolute paths not allowed with scoped storage
+ // whilst we do block entering a '/' as first character in the InputFilter, users could
+ // get around this (e.g., put a '/' as second character, then delete the first character)
+ folder = folder.substring(1);
+ }
+
+ if( folder.length() >= 1 && folder.charAt(folder.length()-1) == '/' ) {
+ // strip '/' as last character - MediaStore will ignore it, but seems cleaner to strip it out anyway
+ // (we still need to allow '/' as last character in the InputFilter, otherwise users won't be able to type it whilst writing a subfolder)
+ folder = folder.substring(0, folder.length()-1);
+ }
+
+ return folder;
+ }
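As a usage illustration, the three sanitization passes above (collapse repeated `/`, strip a leading `/`, strip a trailing `/`) compose as follows. This is a condensed standalone sketch (the class name `SaveFolderSanitizer` is invented; `startsWith`/`endsWith` replace the explicit length checks, which is behaviorally equivalent including for the empty string):

```java
public class SaveFolderSanitizer {
    // mirrors MainActivity.processUserSaveLocation() from the patch above
    static String processUserSaveLocation(String folder) {
        while( folder.contains("//") )
            folder = folder.replaceAll("//", "/");          // collapse repeated separators
        if( folder.startsWith("/") )
            folder = folder.substring(1);                   // no absolute paths under scoped storage
        if( folder.endsWith("/") )
            folder = folder.substring(0, folder.length()-1); // trailing '/' is redundant for MediaStore
        return folder;
    }

    public static void main(String[] args) {
        System.out.println(processUserSaveLocation("/OpenCamera//sub/")); // OpenCamera/sub
    }
}
```

The while loop matters for runs of three or more slashes: a single `replaceAll("//", "/")` pass over `a////b` still leaves `a//b`, so the method iterates until no `//` remains.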
+
+ /** Creates a dialog builder for specifying a save folder dialog (used when not using SAF,
+ * and on scoped storage, as an alternative to using FolderChooserDialog).
+ */
+ public AlertDialog.Builder createSaveFolderDialog() {
+ final AlertDialog.Builder alertDialog = new AlertDialog.Builder(this);
+ alertDialog.setTitle(R.string.preference_save_location);
+
+ final View dialog_view = LayoutInflater.from(this).inflate(R.layout.alertdialog_edittext, null);
+ final EditText editText = dialog_view.findViewById(R.id.edit_text);
+
+ // set hint instead of content description for EditText, see https://support.google.com/accessibility/android/answer/6378120
+ editText.setHint(getResources().getString(R.string.preference_save_location));
+ editText.setInputType(InputType.TYPE_CLASS_TEXT);
+ SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
+ editText.setText(sharedPreferences.getString(PreferenceKeys.SaveLocationPreferenceKey, "OpenCamera"));
+ InputFilter filter = new InputFilter() {
+ // whilst Android seems to allow any characters on internal memory, SD cards are typically formatted with FAT32
+ final String disallowed = "|\\?*<\":>";
+ public CharSequence filter(CharSequence source, int start, int end, Spanned dest, int dstart, int dend) {
+ for(int i=start;i<end;i++) {
+ if( disallowed.indexOf( source.charAt(i) ) != -1 ) {
+ // reject characters not supported on FAT32
+ return "";
+ }
+ }
+ return null;
+ }
+ // PermissionsHandler.onRequestPermissionsResult() will be when the application
+ // is still paused - so we won't do anything here, but instead initLocation() will be called after when resuming.
}
else if( camera_in_background ) {
if( MyDebug.LOG )
@@ -4932,6 +6021,7 @@ public class MainActivity extends Activity {
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
if( MyDebug.LOG )
Log.d(TAG, "onRequestPermissionsResult: requestCode " + requestCode);
+ super.onRequestPermissionsResult(requestCode, permissions, grantResults);
permissionHandler.onRequestPermissionsResult(requestCode, grantResults);
}
diff --git a/app/src/main/java/net/sourceforge/opencamera/MyApplicationInterface.java b/app/src/main/java/net/sourceforge/opencamera/MyApplicationInterface.java
index 2726706faa1e5e618befa4e3c9c7bf57e2bee897..22454e4351ae37f48748ffef2348368a33df2c42 100644
--- a/app/src/main/java/net/sourceforge/opencamera/MyApplicationInterface.java
+++ b/app/src/main/java/net/sourceforge/opencamera/MyApplicationInterface.java
@@ -2,6 +2,7 @@ package net.sourceforge.opencamera;
import android.annotation.TargetApi;
import android.app.Activity;
+import android.content.ContentValues;
import android.content.Context;
import android.content.Intent;
import android.content.SharedPreferences;
@@ -10,8 +11,9 @@ import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Rect;
-import android.location.Address;
-import android.location.Geocoder;
+//import android.location.Address; // don't use until we have info for data privacy!
+//import android.location.Geocoder; // don't use until we have info for data privacy!
+import android.hardware.camera2.CameraExtensionCharacteristics;
import android.location.Location;
import android.media.MediaMetadataRetriever;
import android.media.MediaPlayer;
@@ -29,9 +31,12 @@ import android.provider.Settings;
import android.util.Log;
import android.util.Pair;
import android.view.MotionEvent;
+import android.view.Surface;
import android.view.View;
import android.widget.ImageButton;
+import androidx.annotation.RequiresApi;
+
import net.sourceforge.opencamera.cameracontroller.CameraController;
import net.sourceforge.opencamera.cameracontroller.RawImage;
import net.sourceforge.opencamera.preview.ApplicationInterface;
@@ -70,7 +75,13 @@ public class MyApplicationInterface extends BasicApplicationInterface {
FocusBracketing, // take multiple focus bracketed images, without combining to a single image
FastBurst,
NoiseReduction,
- Panorama
+ Panorama,
+ // camera vendor extensions:
+ X_Auto,
+ X_HDR,
+ X_Night,
+ X_Bokeh,
+ X_Beauty
}
private final MainActivity main_activity;
@@ -89,7 +100,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
private boolean panorama_dir_left_to_right = true; // direction of panorama (set after we've captured two images)
private File last_video_file = null;
- private Uri last_video_file_saf = null;
+ private Uri last_video_file_uri = null;
private final Timer subtitleVideoTimer = new Timer();
private TimerTask subtitleVideoTimerTask;
@@ -100,7 +111,12 @@ public class MyApplicationInterface extends BasicApplicationInterface {
// store to avoid calling PreferenceManager.getDefaultSharedPreferences() repeatedly
private final SharedPreferences sharedPreferences;
- private boolean last_images_saf; // whether the last images array are using SAF or not
+ private enum LastImagesType {
+ FILE,
+ SAF,
+ MEDIASTORE
+ }
+ private LastImagesType last_images_type = LastImagesType.FILE; // whether the last images array are using File API, SAF or MediaStore
/** This class keeps track of the images saved in this batch, for use with Pause Preview option, so we can share or trash images.
*/
@@ -143,7 +159,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
private final static float aperture_default = -1.0f;
private float aperture = aperture_default;
// camera properties that aren't saved even in the bundle; these should be initialised/reset in reset()
- private int zoom_factor; // don't save zoom, as doing so tends to confuse users; other camera applications don't seem to save zoom when pause/resuming
+ private int zoom_factor = -1; // don't save zoom, as doing so tends to confuse users; other camera applications don't seem to save zoom when pause/resuming
// for testing:
public volatile int test_n_videos_scanned;
@@ -256,7 +272,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
}
/** If adding extra calls to this, consider whether explicit user permission is required, and whether
- * privacy policy needs updating.
+ * privacy policy or data privacy section needs updating.
* Returns null if location not available.
*/
@Override
@@ -265,7 +281,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
}
/** If adding extra calls to this, consider whether explicit user permission is required, and whether
- * privacy policy needs updating.
+ * privacy policy or data privacy section needs updating.
* Returns null if location not available.
*/
public Location getLocation(LocationSupplier.LocationInfo locationInfo) {
@@ -273,9 +289,8 @@ public class MyApplicationInterface extends BasicApplicationInterface {
}
@Override
- public int createOutputVideoMethod() {
- String action = main_activity.getIntent().getAction();
- if( MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
+ public VideoMethod createOutputVideoMethod() {
+ if( isVideoCaptureIntent() ) {
if( MyDebug.LOG )
Log.d(TAG, "from video capture intent");
Bundle myExtras = main_activity.getIntent().getExtras();
@@ -284,17 +299,30 @@ public class MyApplicationInterface extends BasicApplicationInterface {
if( intent_uri != null ) {
if( MyDebug.LOG )
Log.d(TAG, "save to: " + intent_uri);
- return VIDEOMETHOD_URI;
+ return VideoMethod.URI;
}
}
// if no EXTRA_OUTPUT, we should save to standard location, and will pass back the Uri of that location
if( MyDebug.LOG )
Log.d(TAG, "intent uri not specified");
- // note that SAF URIs don't seem to work for calling applications (tested with Grabilla and "Photo Grabber Image From Video" (FreezeFrame)), so we use standard folder with non-SAF method
- return VIDEOMETHOD_FILE;
+ if( MainActivity.useScopedStorage() ) {
+ // can't use file method with scoped storage
+ return VideoMethod.MEDIASTORE;
+ }
+ else {
+ // note that SAF URIs don't seem to work for calling applications (tested with Grabilla and "Photo Grabber Image From Video" (FreezeFrame)), so we use standard folder with non-SAF method
+ return VideoMethod.FILE;
+ }
+ }
+ else if( storageUtils.isUsingSAF() ) {
+ return VideoMethod.SAF;
+ }
+ else if( MainActivity.useScopedStorage() ) {
+ return VideoMethod.MEDIASTORE;
+ }
+ else {
+ return VideoMethod.FILE;
}
- boolean using_saf = storageUtils.isUsingSAF();
- return using_saf ? VIDEOMETHOD_SAF : VIDEOMETHOD_FILE;
}
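The decision order in createOutputVideoMethod() above (intent-supplied URI first, then scoped-storage MediaStore, then SAF, then plain File) can be sketched as a pure function. This is a minimal sketch, not the patch itself: the enum names mirror the patch, and the boolean flags stand in for the real checks against the intent, StorageUtils.isUsingSAF() and MainActivity.useScopedStorage().

```java
// Sketch of the storage-method precedence used by createOutputVideoMethod().
public class VideoMethodChooser {
    enum VideoMethod { FILE, SAF, URI, MEDIASTORE }

    static VideoMethod choose(boolean videoCaptureIntent, boolean intentHasUri,
                              boolean usingSaf, boolean scopedStorage) {
        if (videoCaptureIntent) {
            if (intentHasUri)
                return VideoMethod.URI; // save to the Uri the caller supplied
            // no EXTRA_OUTPUT: save to the standard location ourselves;
            // the file method isn't available under scoped storage
            return scopedStorage ? VideoMethod.MEDIASTORE : VideoMethod.FILE;
        }
        if (usingSaf)
            return VideoMethod.SAF;
        return scopedStorage ? VideoMethod.MEDIASTORE : VideoMethod.FILE;
    }

    public static void main(String[] args) {
        System.out.println(choose(true, true, false, false));  // URI
        System.out.println(choose(false, false, true, true));  // SAF wins over scoped storage
    }
}
```

Note that SAF takes precedence over scoped storage in the non-intent path, matching the else-if order in the patch.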
@Override
@@ -305,14 +333,61 @@ public class MyApplicationInterface extends BasicApplicationInterface {
@Override
public Uri createOutputVideoSAF(String extension) throws IOException {
- last_video_file_saf = storageUtils.createOutputMediaFileSAF(StorageUtils.MEDIA_TYPE_VIDEO, "", extension, new Date());
- return last_video_file_saf;
+ last_video_file_uri = storageUtils.createOutputMediaFileSAF(StorageUtils.MEDIA_TYPE_VIDEO, "", extension, new Date());
+ return last_video_file_uri;
+ }
+
+ @Override
+ public Uri createOutputVideoMediaStore(String extension) throws IOException {
+ Uri folder = Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ?
+ MediaStore.Video.Media.getContentUri(MediaStore.VOLUME_EXTERNAL_PRIMARY) :
+ MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
+ ContentValues contentValues = new ContentValues();
+ String filename = storageUtils.createMediaFilename(StorageUtils.MEDIA_TYPE_VIDEO, "", 0, "." + extension, new Date());
+ if( MyDebug.LOG )
+ Log.d(TAG, "filename: " + filename);
+ contentValues.put(MediaStore.Video.Media.DISPLAY_NAME, filename);
+ String mime_type = storageUtils.getVideoMimeType(extension);
+ if( MyDebug.LOG )
+ Log.d(TAG, "mime_type: " + mime_type);
+ contentValues.put(MediaStore.Video.Media.MIME_TYPE, mime_type);
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ String relative_path = storageUtils.getSaveRelativeFolder();
+ if( MyDebug.LOG )
+ Log.d(TAG, "relative_path: " + relative_path);
+ contentValues.put(MediaStore.Video.Media.RELATIVE_PATH, relative_path);
+ contentValues.put(MediaStore.Video.Media.IS_PENDING, 1);
+ }
+
+ try {
+ last_video_file_uri = main_activity.getContentResolver().insert(folder, contentValues);
+ if( MyDebug.LOG )
+ Log.d(TAG, "uri: " + last_video_file_uri);
+ }
+ catch(IllegalArgumentException e) {
+ // can happen for the MediaStore method if the ContentResolver.insert() call is invalid
+ if( MyDebug.LOG )
+ Log.e(TAG, "IllegalArgumentException writing video file: " + e.getMessage());
+ e.printStackTrace();
+ throw new IOException();
+ }
+ catch(IllegalStateException e) {
+ // have received Google Play crash reports from the ContentResolver.insert() call for the MediaStore method
+ if( MyDebug.LOG )
+ Log.e(TAG, "IllegalStateException writing video file: " + e.getMessage());
+ e.printStackTrace();
+ throw new IOException();
+ }
+ if( last_video_file_uri == null ) {
+ throw new IOException();
+ }
+
+ return last_video_file_uri;
}
@Override
public Uri createOutputVideoUri() {
- String action = main_activity.getIntent().getAction();
- if( MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
+ if( isVideoCaptureIntent() ) {
if( MyDebug.LOG )
Log.d(TAG, "from video capture intent");
Bundle myExtras = main_activity.getIntent().getExtras();
@@ -581,6 +656,10 @@ public class MyApplicationInterface extends BasicApplicationInterface {
@Override
public boolean getFaceDetectionPref() {
+ if( isCameraExtensionPref() ) {
+ // not supported for camera extensions
+ return false;
+ }
return sharedPreferences.getBoolean(PreferenceKeys.FaceDetectionPreferenceKey, false);
}
@@ -593,8 +672,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
@Override
public String getVideoQualityPref() {
- String action = main_activity.getIntent().getAction();
- if( MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
+ if( isVideoCaptureIntent() ) {
if( MyDebug.LOG )
Log.d(TAG, "from video capture intent");
if( main_activity.getIntent().hasExtra(MediaStore.EXTRA_VIDEO_QUALITY) ) {
@@ -647,8 +725,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
@Override
public String getVideoFPSPref() {
// if check for EXTRA_VIDEO_QUALITY, if set, best to fall back to default FPS - see corresponding code in getVideoQualityPref
- String action = main_activity.getIntent().getAction();
- if( MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
+ if( isVideoCaptureIntent() ) {
if( MyDebug.LOG )
Log.d(TAG, "from video capture intent");
if( main_activity.getIntent().hasExtra(MediaStore.EXTRA_VIDEO_QUALITY) ) {
@@ -847,15 +924,14 @@ public class MyApplicationInterface extends BasicApplicationInterface {
@Override
public long getVideoMaxDurationPref() {
- String action = main_activity.getIntent().getAction();
- if( MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
+ if( isVideoCaptureIntent() ) {
if( MyDebug.LOG )
Log.d(TAG, "from video capture intent");
if( main_activity.getIntent().hasExtra(MediaStore.EXTRA_DURATION_LIMIT) ) {
int intent_duration_limit = main_activity.getIntent().getIntExtra(MediaStore.EXTRA_DURATION_LIMIT, 0);
if( MyDebug.LOG )
Log.d(TAG, "intent_duration_limit: " + intent_duration_limit);
- return intent_duration_limit * 1000;
+ return intent_duration_limit * 1000L;
}
}
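The change from `* 1000` to `* 1000L` above matters because EXTRA_DURATION_LIMIT is an int in seconds: with `* 1000` the multiplication is done in 32-bit arithmetic and overflows for large limits before the result is widened to long. A minimal illustration:

```java
// Why "* 1000L": int multiplication overflows before widening to long.
public class DurationOverflow {
    // seconds -> milliseconds, the buggy way (32-bit multiply, then widen)
    static long toMsInt(int seconds) { return seconds * 1000; }

    // seconds -> milliseconds, the fixed way (64-bit multiply)
    static long toMsLong(int seconds) { return seconds * 1000L; }

    public static void main(String[] args) {
        int limit = 3_000_000; // ~35 days; exceeds Integer.MAX_VALUE once in ms
        System.out.println(toMsInt(limit));  // -1294967296 (wrapped around)
        System.out.println(toMsLong(limit)); // 3000000000
    }
}
```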
@@ -893,8 +969,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
if( MyDebug.LOG )
Log.d(TAG, "getVideoMaxFileSizeUserPref");
- String action = main_activity.getIntent().getAction();
- if( MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
+ if( isVideoCaptureIntent() ) {
if( MyDebug.LOG )
Log.d(TAG, "from video capture intent");
if( main_activity.getIntent().hasExtra(MediaStore.EXTRA_SIZE_LIMIT) ) {
@@ -916,14 +991,14 @@ public class MyApplicationInterface extends BasicApplicationInterface {
e.printStackTrace();
video_max_filesize = 0;
}
+ //video_max_filesize = 1024*1024; // test
if( MyDebug.LOG )
Log.d(TAG, "video_max_filesize: " + video_max_filesize);
return video_max_filesize;
}
private boolean getVideoRestartMaxFileSizeUserPref() {
- String action = main_activity.getIntent().getAction();
- if( MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
+ if( isVideoCaptureIntent() ) {
if( MyDebug.LOG )
Log.d(TAG, "from video capture intent");
if( main_activity.getIntent().hasExtra(MediaStore.EXTRA_SIZE_LIMIT) ) {
@@ -958,11 +1033,11 @@ public class MyApplicationInterface extends BasicApplicationInterface {
if( MyDebug.LOG )
Log.d(TAG, "saving to: " + folder_name);
boolean is_internal = false;
- if( !folder_name.startsWith("/") ) {
+ if( !StorageUtils.saveFolderIsFull(folder_name) ) {
is_internal = true;
}
else {
- // if save folder path is a full path, see if it matches the "external" storage (which actually means "primary", which typically isn't an SD card these days)
+ // If save folder path is a full path, see if it matches the "external" storage (which actually means "primary", which typically isn't an SD card these days).
File storage = Environment.getExternalStorageDirectory();
if( MyDebug.LOG )
Log.d(TAG, "compare to: " + storage.getAbsolutePath());
@@ -1032,11 +1107,6 @@ public class MyApplicationInterface extends BasicApplicationInterface {
return sharedPreferences.getString(PreferenceKeys.PreviewSizePreferenceKey, "preference_preview_size_wysiwyg");
}
- @Override
- public String getPreviewRotationPref() {
- return sharedPreferences.getString(PreferenceKeys.RotatePreviewPreferenceKey, "0");
- }
-
@Override
public String getLockOrientationPref() {
if( getPhotoMode() == PhotoMode.Panorama )
@@ -1210,9 +1280,9 @@ public class MyApplicationInterface extends BasicApplicationInterface {
return sharedPreferences.getString(PreferenceKeys.StampGPSFormatPreferenceKey, "preference_stamp_gpsformat_default");
}
- private String getStampGeoAddressPref() {
+ /*private String getStampGeoAddressPref() {
return sharedPreferences.getString(PreferenceKeys.StampGeoAddressPreferenceKey, "preference_stamp_geo_address_no");
- }
+ }*/
private String getUnitsDistancePref() {
return sharedPreferences.getString(PreferenceKeys.UnitsDistancePreferenceKey, "preference_units_distance_m");
@@ -1239,7 +1309,17 @@ public class MyApplicationInterface extends BasicApplicationInterface {
return font_size;
}
- private String getVideoSubtitlePref() {
+ /** Whether the MediaStore API supports saving subtitle files.
+ */
+ static boolean mediastoreSupportsVideoSubtitles() {
+ // Android 11+ no longer allows the MediaStore API to save file types that Android doesn't support!
+ return Build.VERSION.SDK_INT < Build.VERSION_CODES.R;
+ }
+
+ private String getVideoSubtitlePref(VideoMethod video_method) {
+ if( video_method == VideoMethod.MEDIASTORE && !mediastoreSupportsVideoSubtitles() ) {
+ return "preference_video_subtitle_no";
+ }
return sharedPreferences.getString(PreferenceKeys.VideoSubtitlePref, "preference_video_subtitle_no");
}
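The fallback in getVideoSubtitlePref() above can be expressed as a pure function of the video method and SDK level: on Android 11+ the MediaStore path silently forces subtitles off, regardless of the stored preference. In this sketch, the hard-coded 30 for Build.VERSION_CODES.R and the string video-method parameter are stand-ins for the real Android constants and enum:

```java
// Sketch of the subtitle-preference fallback: MediaStore can't save .srt on API 30+.
public class SubtitlePref {
    static final int VERSION_R = 30; // Build.VERSION_CODES.R

    static boolean mediastoreSupportsVideoSubtitles(int sdkInt) {
        // Android 11+ no longer lets the MediaStore API save file types it doesn't know
        return sdkInt < VERSION_R;
    }

    static String getVideoSubtitlePref(String videoMethod, int sdkInt, String storedPref) {
        if (videoMethod.equals("MEDIASTORE") && !mediastoreSupportsVideoSubtitles(sdkInt))
            return "preference_video_subtitle_no"; // forced off, ignoring the stored preference
        return storedPref;
    }

    public static void main(String[] args) {
        System.out.println(getVideoSubtitlePref("MEDIASTORE", 30, "preference_video_subtitle_yes"));
    }
}
```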
@@ -1368,6 +1448,34 @@ public class MyApplicationInterface extends BasicApplicationInterface {
return imageSaver.queueWouldBlock(n_raw, n_jpegs);
}
+ /** Returns the ROTATION_* enum of the display relative to the natural device orientation, but
+ * also accounts for the preview being rotated due to the user preference
+ * RotatePreviewPreferenceKey.
+ */
+ @Override
+ public int getDisplayRotation() {
+ // important to use cached rotation to reduce issues of incorrect focus square location when
+ // rotating device, due to strange Android behaviour where rotation changes shortly before
+ // the configuration actually changes
+ int rotation = main_activity.getDisplayRotation();
+
+ String rotate_preview = sharedPreferences.getString(PreferenceKeys.RotatePreviewPreferenceKey, "0");
+ if( MyDebug.LOG )
+ Log.d(TAG, " rotate_preview = " + rotate_preview);
+ if( rotate_preview.equals("180") ) {
+ switch (rotation) {
+ case Surface.ROTATION_0: rotation = Surface.ROTATION_180; break;
+ case Surface.ROTATION_90: rotation = Surface.ROTATION_270; break;
+ case Surface.ROTATION_180: rotation = Surface.ROTATION_0; break;
+ case Surface.ROTATION_270: rotation = Surface.ROTATION_90; break;
+ default:
+ break;
+ }
+ }
+
+ return rotation;
+ }
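Since Surface.ROTATION_0 through ROTATION_270 are documented as the consecutive integers 0 to 3, the 180° flip in the switch above is equivalent to adding two quarter-turns modulo 4. A minimal sketch with the constants hard-coded to their documented values:

```java
// The 180-degree preview flip from getDisplayRotation(), written two ways.
// Surface.ROTATION_0/90/180/270 are documented as 0, 1, 2, 3.
public class RotationFlip {
    static final int ROTATION_0 = 0, ROTATION_90 = 1, ROTATION_180 = 2, ROTATION_270 = 3;

    // literal translation of the switch in the patch
    static int flipSwitch(int rotation) {
        switch (rotation) {
            case ROTATION_0:   return ROTATION_180;
            case ROTATION_90:  return ROTATION_270;
            case ROTATION_180: return ROTATION_0;
            case ROTATION_270: return ROTATION_90;
            default:           return rotation;
        }
    }

    // equivalent closed form: rotate by two quarter-turns
    static int flipMod(int rotation) { return (rotation + 2) % 4; }

    public static void main(String[] args) {
        for (int r = 0; r < 4; r++)
            System.out.println(r + " -> " + flipSwitch(r));
    }
}
```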
+
@Override
public long getExposureTimePref() {
return sharedPreferences.getLong(PreferenceKeys.ExposureTimePreferenceKey, CameraController.EXPOSURE_TIME_DEFAULT);
@@ -1444,6 +1552,34 @@ public class MyApplicationInterface extends BasicApplicationInterface {
return NRModePref.NRMODE_NORMAL;
}
+ @Override
+ public boolean isCameraExtensionPref() {
+ PhotoMode photo_mode = getPhotoMode();
+ return photo_mode == PhotoMode.X_Auto || photo_mode == PhotoMode.X_HDR || photo_mode == PhotoMode.X_Night || photo_mode == PhotoMode.X_Bokeh || photo_mode == PhotoMode.X_Beauty;
+ }
+
+ @Override
+ @RequiresApi(api = Build.VERSION_CODES.S)
+ public int getCameraExtensionPref() {
+ PhotoMode photo_mode = getPhotoMode();
+ if( photo_mode == PhotoMode.X_Auto ) {
+ return CameraExtensionCharacteristics.EXTENSION_AUTOMATIC;
+ }
+ else if( photo_mode == PhotoMode.X_HDR ) {
+ return CameraExtensionCharacteristics.EXTENSION_HDR;
+ }
+ else if( photo_mode == PhotoMode.X_Night ) {
+ return CameraExtensionCharacteristics.EXTENSION_NIGHT;
+ }
+ else if( photo_mode == PhotoMode.X_Bokeh ) {
+ return CameraExtensionCharacteristics.EXTENSION_BOKEH;
+ }
+ else if( photo_mode == PhotoMode.X_Beauty ) {
+ return CameraExtensionCharacteristics.EXTENSION_BEAUTY;
+ }
+ return 0;
+ }
+
public void setAperture(float aperture) {
this.aperture = aperture;
}
@@ -1559,6 +1695,21 @@ public class MyApplicationInterface extends BasicApplicationInterface {
boolean panorama = photo_mode_pref.equals("preference_photo_mode_panorama");
if( panorama && !main_activity.getPreview().isVideo() && main_activity.supportsPanorama() )
return PhotoMode.Panorama;
+ boolean x_auto = photo_mode_pref.equals("preference_photo_mode_x_auto");
+ if( x_auto && !main_activity.getPreview().isVideo() && main_activity.supportsCameraExtension(CameraExtensionCharacteristics.EXTENSION_AUTOMATIC) )
+ return PhotoMode.X_Auto;
+ boolean x_hdr = photo_mode_pref.equals("preference_photo_mode_x_hdr");
+ if( x_hdr && !main_activity.getPreview().isVideo() && main_activity.supportsCameraExtension(CameraExtensionCharacteristics.EXTENSION_HDR) )
+ return PhotoMode.X_HDR;
+ boolean x_night = photo_mode_pref.equals("preference_photo_mode_x_night");
+ if( x_night && !main_activity.getPreview().isVideo() && main_activity.supportsCameraExtension(CameraExtensionCharacteristics.EXTENSION_NIGHT) )
+ return PhotoMode.X_Night;
+ boolean x_bokeh = photo_mode_pref.equals("preference_photo_mode_x_bokeh");
+ if( x_bokeh && !main_activity.getPreview().isVideo() && main_activity.supportsCameraExtension(CameraExtensionCharacteristics.EXTENSION_BOKEH) )
+ return PhotoMode.X_Bokeh;
+ boolean x_beauty = photo_mode_pref.equals("preference_photo_mode_x_beauty");
+ if( x_beauty && !main_activity.getPreview().isVideo() && main_activity.supportsCameraExtension(CameraExtensionCharacteristics.EXTENSION_BEAUTY) )
+ return PhotoMode.X_Beauty;
return PhotoMode.Standard;
}
@@ -1607,6 +1758,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
main_activity.supportsBurstRaw();
}
// not supported for panorama mode
+ // not supported for camera vendor extensions
return false;
}
@@ -1659,6 +1811,11 @@ public class MyApplicationInterface extends BasicApplicationInterface {
return sharedPreferences.getBoolean(PreferenceKeys.Camera2FakeFlashPreferenceKey, false);
}
+ @Override
+ public boolean useCamera2DummyCaptureHack() {
+ return sharedPreferences.getBoolean(PreferenceKeys.Camera2DummyCaptureHackPreferenceKey, false);
+ }
+
@Override
public boolean useCamera2FastBurst() {
return sharedPreferences.getBoolean(PreferenceKeys.Camera2FastBurstPreferenceKey, true);
@@ -1679,8 +1836,9 @@ public class MyApplicationInterface extends BasicApplicationInterface {
@Override
public boolean allowZoom() {
- if( getPhotoMode() == PhotoMode.Panorama ) {
+ if( getPhotoMode() == PhotoMode.Panorama || isCameraExtensionPref() ) {
// don't allow zooming in panorama mode, the algorithm isn't set up to support this!
+ // zoom also not supported for camera extensions
return false;
}
return true;
@@ -1795,6 +1953,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
if( MyDebug.LOG )
Log.d(TAG, "setNextPanoramaPoint : " + x + " , " + y + " , " + z);
+ @SuppressWarnings("PointlessArithmeticExpression")
final float target_angle = 1.0f * 0.01745329252f;
//final float target_angle = 0.5f * 0.01745329252f;
final float upright_angle_tol = 2.0f * 0.017452406437f;
@@ -1853,6 +2012,9 @@ public class MyApplicationInterface extends BasicApplicationInterface {
public void touchEvent(MotionEvent event) {
main_activity.getMainUI().closeExposureUI();
main_activity.getMainUI().closePopup();
+ if( main_activity.usingKitKatImmersiveMode() ) {
+ main_activity.setImmersiveMode(false);
+ }
}
@Override
@@ -1868,236 +2030,316 @@ public class MyApplicationInterface extends BasicApplicationInterface {
main_activity.getMainUI().destroyPopup(); // as the available popup options change while recording video
}
- @Override
- public void startedVideo() {
- if( MyDebug.LOG )
- Log.d(TAG, "startedVideo()");
- if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
- View pauseVideoButton = main_activity.findViewById(R.id.pause_video);
- pauseVideoButton.setVisibility(View.VISIBLE);
- main_activity.getMainUI().setPauseVideoContentDescription();
- }
- if (main_activity.getPreview().supportsPhotoVideoRecording() && this.usePhotoVideoRecording()) {
- View takePhotoVideoButton = main_activity.findViewById(R.id.take_photo_when_video_recording);
- takePhotoVideoButton.setVisibility(View.VISIBLE);
- }
- if( main_activity.getMainUI().isExposureUIOpen() ) {
- if( MyDebug.LOG )
- Log.d(TAG, "need to update exposure UI for start video recording");
- // need to update the exposure UI when starting/stopping video recording, to remove/add
- // ability to switch between auto and manual
- main_activity.getMainUI().setupExposureUI();
- }
+ private void startVideoSubtitlesTask(final VideoMethod video_method) {
+ final String preference_stamp_dateformat = this.getStampDateFormatPref();
+ final String preference_stamp_timeformat = this.getStampTimeFormatPref();
+ final String preference_stamp_gpsformat = this.getStampGPSFormatPref();
+ final String preference_units_distance = this.getUnitsDistancePref();
+ //final String preference_stamp_geo_address = this.getStampGeoAddressPref();
+ final boolean store_location = getGeotaggingPref();
+ final boolean store_geo_direction = getGeodirectionPref();
+ class SubtitleVideoTimerTask extends TimerTask {
+ // need to keep a reference to pfd_saf for as long as writer, to avoid getting garbage collected - see https://sourceforge.net/p/opencamera/tickets/417/
+ private ParcelFileDescriptor pfd_saf;
+ private OutputStreamWriter writer;
+ private Uri uri;
+ private int count = 1;
+ private long min_video_time_from = 0;
+
+ private String getSubtitleFilename(String video_filename) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "getSubtitleFilename");
+ int indx = video_filename.indexOf('.');
+ if( indx != -1 ) {
+ video_filename = video_filename.substring(0, indx);
+ }
+ video_filename = video_filename + ".srt";
+ if( MyDebug.LOG )
+ Log.d(TAG, "return filename: " + video_filename);
+ return video_filename;
+ }
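getSubtitleFilename() above derives the .srt name by truncating at the first '.'. The standalone copy below keeps the patch's indexOf behaviour and merely flags in a comment that for names containing extra dots, lastIndexOf('.') would preserve more of the name:

```java
// Standalone copy of the subtitle-filename derivation from the patch.
public class SubtitleName {
    static String getSubtitleFilename(String videoFilename) {
        // truncates at the FIRST dot, as in the patch; lastIndexOf('.') would
        // keep "VID_2021.06.01" intact instead of cutting it to "VID_2021"
        int indx = videoFilename.indexOf('.');
        if (indx != -1)
            videoFilename = videoFilename.substring(0, indx);
        return videoFilename + ".srt";
    }

    public static void main(String[] args) {
        System.out.println(getSubtitleFilename("VID_20210601_120000.mp4"));
    }
}
```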
- ImageButton view = main_activity.findViewById(R.id.take_photo);
- view.setImageResource(R.drawable.ic_camera_video_recording);
- final int video_method = this.createOutputVideoMethod();
- boolean dategeo_subtitles = getVideoSubtitlePref().equals("preference_video_subtitle_yes");
- if( dategeo_subtitles && video_method != ApplicationInterface.VIDEOMETHOD_URI ) {
- final String preference_stamp_dateformat = this.getStampDateFormatPref();
- final String preference_stamp_timeformat = this.getStampTimeFormatPref();
- final String preference_stamp_gpsformat = this.getStampGPSFormatPref();
- final String preference_units_distance = this.getUnitsDistancePref();
- final String preference_stamp_geo_address = this.getStampGeoAddressPref();
- final boolean store_location = getGeotaggingPref();
- final boolean store_geo_direction = getGeodirectionPref();
- class SubtitleVideoTimerTask extends TimerTask {
- OutputStreamWriter writer;
- private int count = 1;
- private long min_video_time_from = 0;
-
- private String getSubtitleFilename(String video_filename) {
- if( MyDebug.LOG )
- Log.d(TAG, "getSubtitleFilename");
- int indx = video_filename.indexOf('.');
- if( indx != -1 ) {
- video_filename = video_filename.substring(0, indx);
- }
- video_filename = video_filename + ".srt";
+ public void run() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "SubtitleVideoTimerTask run");
+ long video_time = main_activity.getPreview().getVideoTime(true); // n.b., in case of restarts due to max filesize, we only want the time for this video file!
+ if( !main_activity.getPreview().isVideoRecording() ) {
if( MyDebug.LOG )
- Log.d(TAG, "return filename: " + video_filename);
- return video_filename;
+ Log.d(TAG, "no longer video recording");
+ return;
}
-
- public void run() {
+ if( main_activity.getPreview().isVideoRecordingPaused() ) {
if( MyDebug.LOG )
- Log.d(TAG, "SubtitleVideoTimerTask run");
- long video_time = main_activity.getPreview().getVideoTime();
- if( !main_activity.getPreview().isVideoRecording() ) {
- if( MyDebug.LOG )
- Log.d(TAG, "no longer video recording");
- return;
- }
- if( main_activity.getPreview().isVideoRecordingPaused() ) {
- if( MyDebug.LOG )
- Log.d(TAG, "video recording is paused");
- return;
- }
- Date current_date = new Date();
- Calendar current_calendar = Calendar.getInstance();
- int offset_ms = current_calendar.get(Calendar.MILLISECOND);
- // We subtract an offset, because if the current time is say 00:00:03.425 and the video has been recording for
- // 1s, we instead need to record the video time when it became 00:00:03.000. This does mean that the GPS
- // location is going to be off by up to 1s, but that should be less noticeable than the clock being off.
- if( MyDebug.LOG ) {
- Log.d(TAG, "count: " + count);
- Log.d(TAG, "offset_ms: " + offset_ms);
- Log.d(TAG, "video_time: " + video_time);
- }
- String date_stamp = TextFormatter.getDateString(preference_stamp_dateformat, current_date);
- String time_stamp = TextFormatter.getTimeString(preference_stamp_timeformat, current_date);
- Location location = store_location ? getLocation() : null;
- double geo_direction = store_geo_direction && main_activity.getPreview().hasGeoDirection() ? main_activity.getPreview().getGeoDirection() : 0.0;
- String gps_stamp = main_activity.getTextFormatter().getGPSString(preference_stamp_gpsformat, preference_units_distance, store_location && location!=null, location, store_geo_direction && main_activity.getPreview().hasGeoDirection(), geo_direction);
- if( MyDebug.LOG ) {
- Log.d(TAG, "date_stamp: " + date_stamp);
- Log.d(TAG, "time_stamp: " + time_stamp);
- // don't log gps_stamp, in case of privacy!
- }
-
- String datetime_stamp = "";
- if( date_stamp.length() > 0 )
- datetime_stamp += date_stamp;
- if( time_stamp.length() > 0 ) {
- if( datetime_stamp.length() > 0 )
- datetime_stamp += " ";
- datetime_stamp += time_stamp;
- }
+ Log.d(TAG, "video recording is paused");
+ return;
+ }
+ Date current_date = new Date();
+ Calendar current_calendar = Calendar.getInstance();
+ int offset_ms = current_calendar.get(Calendar.MILLISECOND);
+ // We subtract an offset, because if the current time is say 00:00:03.425 and the video has been recording for
+ // 1s, we instead need to record the video time when it became 00:00:03.000. This does mean that the GPS
+ // location is going to be off by up to 1s, but that should be less noticeable than the clock being off.
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "count: " + count);
+ Log.d(TAG, "offset_ms: " + offset_ms);
+ Log.d(TAG, "video_time: " + video_time);
+ }
+ String date_stamp = TextFormatter.getDateString(preference_stamp_dateformat, current_date);
+ String time_stamp = TextFormatter.getTimeString(preference_stamp_timeformat, current_date);
+ Location location = store_location ? getLocation() : null;
+ double geo_direction = store_geo_direction && main_activity.getPreview().hasGeoDirection() ? main_activity.getPreview().getGeoDirection() : 0.0;
+ String gps_stamp = main_activity.getTextFormatter().getGPSString(preference_stamp_gpsformat, preference_units_distance, store_location && location!=null, location, store_geo_direction && main_activity.getPreview().hasGeoDirection(), geo_direction);
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "date_stamp: " + date_stamp);
+ Log.d(TAG, "time_stamp: " + time_stamp);
+ // don't log gps_stamp, in case of privacy!
+ }
- // build subtitles
- StringBuilder subtitles = new StringBuilder();
+ String datetime_stamp = "";
+ if( date_stamp.length() > 0 )
+ datetime_stamp += date_stamp;
+ if( time_stamp.length() > 0 ) {
if( datetime_stamp.length() > 0 )
- subtitles.append(datetime_stamp).append("\n");
+ datetime_stamp += " ";
+ datetime_stamp += time_stamp;
+ }
- if( gps_stamp.length() > 0 ) {
- Address address = null;
- if( store_location && !preference_stamp_geo_address.equals("preference_stamp_geo_address_no") ) {
- // try to find an address
- if( Geocoder.isPresent() ) {
- if( MyDebug.LOG )
- Log.d(TAG, "geocoder is present");
- Geocoder geocoder = new Geocoder(main_activity, Locale.getDefault());
- try {
- List addresses = geocoder.getFromLocation(location.getLatitude(), location.getLongitude(), 1);
- if( addresses != null && addresses.size() > 0 ) {
- address = addresses.get(0);
- // don't log address, in case of privacy!
- if( MyDebug.LOG ) {
- Log.d(TAG, "max line index: " + address.getMaxAddressLineIndex());
- }
+ // build subtitles
+ StringBuilder subtitles = new StringBuilder();
+ if( datetime_stamp.length() > 0 )
+ subtitles.append(datetime_stamp).append("\n");
+
+ if( gps_stamp.length() > 0 ) {
+ /*Address address = null;
+ if( store_location && !preference_stamp_geo_address.equals("preference_stamp_geo_address_no") ) {
+ // try to find an address
+ if( main_activity.isAppPaused() ) {
+ // seems safer to not try to initiate potential network connections (via geocoder) if Open Camera
+ // is paused - this shouldn't happen, since we stop video when paused, but just to be safe
+ if( MyDebug.LOG )
+ Log.d(TAG, "don't call geocoder for video subtitles as app is paused?!");
+ }
+ else if( Geocoder.isPresent() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "geocoder is present");
+ Geocoder geocoder = new Geocoder(main_activity, Locale.getDefault());
+ try {
+ List addresses = geocoder.getFromLocation(location.getLatitude(), location.getLongitude(), 1);
+ if( addresses != null && addresses.size() > 0 ) {
+ address = addresses.get(0);
+ // don't log address, in case of privacy!
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "max line index: " + address.getMaxAddressLineIndex());
}
}
- catch(Exception e) {
- Log.e(TAG, "failed to read from geocoder");
- e.printStackTrace();
- }
}
- else {
- if( MyDebug.LOG )
- Log.d(TAG, "geocoder not present");
+ catch(Exception e) {
+ Log.e(TAG, "failed to read from geocoder");
+ e.printStackTrace();
}
}
+ else {
+ if( MyDebug.LOG )
+ Log.d(TAG, "geocoder not present");
+ }
+ }
- if( address != null ) {
- for(int i=0;i<=address.getMaxAddressLineIndex();i++) {
- // write in forward order
- String addressLine = address.getAddressLine(i);
- subtitles.append(addressLine).append("\n");
- }
+ if( address != null ) {
+ for(int i=0;i<=address.getMaxAddressLineIndex();i++) {
+ // write in forward order
+ String addressLine = address.getAddressLine(i);
+ subtitles.append(addressLine).append("\n");
}
+ }*/
- if( address == null || preference_stamp_geo_address.equals("preference_stamp_geo_address_both") ) {
- if( MyDebug.LOG )
- Log.d(TAG, "display gps coords");
+ //if( address == null || preference_stamp_geo_address.equals("preference_stamp_geo_address_both") )
+ {
+ if( MyDebug.LOG )
+ Log.d(TAG, "display gps coords");
+ subtitles.append(gps_stamp).append("\n");
+ }
+ /*else if( store_geo_direction ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "not displaying gps coords, but need to display geo direction");
+ gps_stamp = main_activity.getTextFormatter().getGPSString(preference_stamp_gpsformat, preference_units_distance, false, null, store_geo_direction && main_activity.getPreview().hasGeoDirection(), geo_direction);
+ if( gps_stamp.length() > 0 ) {
+ // don't log gps_stamp, in case of privacy!
subtitles.append(gps_stamp).append("\n");
}
- else if( store_geo_direction ) {
- if( MyDebug.LOG )
- Log.d(TAG, "not displaying gps coords, but need to display geo direction");
- gps_stamp = main_activity.getTextFormatter().getGPSString(preference_stamp_gpsformat, preference_units_distance, false, null, store_geo_direction && main_activity.getPreview().hasGeoDirection(), geo_direction);
- if( gps_stamp.length() > 0 ) {
- // don't log gps_stamp, in case of privacy!
- subtitles.append(gps_stamp).append("\n");
- }
- }
- }
+ }*/
+ }
- if( subtitles.length() == 0 ) {
- return;
- }
- long video_time_from = video_time - offset_ms;
- long video_time_to = video_time_from + 999;
- // don't want to start from before 0; also need to keep track of min_video_time_from to avoid bug reported at
- // https://forum.xda-developers.com/showpost.php?p=74827802&postcount=345 for pause video where we ended up
- // with overlapping times when resuming
- if( video_time_from < min_video_time_from )
- video_time_from = min_video_time_from;
- min_video_time_from = video_time_to + 1;
- String subtitle_time_from = TextFormatter.formatTimeMS(video_time_from);
- String subtitle_time_to = TextFormatter.formatTimeMS(video_time_to);
- try {
- synchronized( this ) {
- if( writer == null ) {
- if( video_method == ApplicationInterface.VIDEOMETHOD_FILE ) {
- String subtitle_filename = last_video_file.getAbsolutePath();
- subtitle_filename = getSubtitleFilename(subtitle_filename);
- writer = new FileWriter(subtitle_filename);
+ if( subtitles.length() == 0 ) {
+ return;
+ }
+ long video_time_from = video_time - offset_ms;
+ long video_time_to = video_time_from + 999;
+ // don't want to start from before 0; also need to keep track of min_video_time_from to avoid bug reported at
+ // https://forum.xda-developers.com/showpost.php?p=74827802&postcount=345 for pause video where we ended up
+ // with overlapping times when resuming
+ if( video_time_from < min_video_time_from )
+ video_time_from = min_video_time_from;
+ min_video_time_from = video_time_to + 1;
+ String subtitle_time_from = TextFormatter.formatTimeMS(video_time_from);
+ String subtitle_time_to = TextFormatter.formatTimeMS(video_time_to);
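The window arithmetic above (subtract the sub-second offset, emit a 999 ms window, and clamp the next start to one past the previous window's end) is what keeps SRT cues strictly increasing and non-overlapping across pause/resume. A self-contained sketch of just that logic, with method names of my own choosing:

```java
// Non-overlapping SRT cue windows, as computed in SubtitleVideoTimerTask.run().
public class SrtWindows {
    private long minVideoTimeFrom = 0; // one past the end of the last window

    // returns {from, to} in ms for a cue at the given video time and clock offset
    long[] nextWindow(long videoTimeMs, int offsetMs) {
        long from = videoTimeMs - offsetMs;
        long to = from + 999;
        // never start before 0 or inside the previous window (the pause/resume bug)
        if (from < minVideoTimeFrom)
            from = minVideoTimeFrom;
        minVideoTimeFrom = to + 1;
        return new long[]{from, to};
    }

    public static void main(String[] args) {
        SrtWindows w = new SrtWindows();
        long[] a = w.nextWindow(1425, 425); // {1000, 1999}
        long[] b = w.nextWindow(1900, 425); // start clamped past the previous window
        System.out.println(a[0] + "-" + a[1] + ", " + b[0] + "-" + b[1]);
    }
}
```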
+ try {
+ synchronized( this ) {
+ if( writer == null ) {
+ if( video_method == VideoMethod.FILE ) {
+ String subtitle_filename = last_video_file.getAbsolutePath();
+ subtitle_filename = getSubtitleFilename(subtitle_filename);
+ writer = new FileWriter(subtitle_filename);
+ }
+ else if( video_method == VideoMethod.SAF || video_method == VideoMethod.MEDIASTORE ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "last_video_file_uri: " + last_video_file_uri);
+ String subtitle_filename = storageUtils.getFileName(last_video_file_uri);
+ subtitle_filename = getSubtitleFilename(subtitle_filename);
+ if( video_method == VideoMethod.SAF ) {
+ uri = storageUtils.createOutputFileSAF(subtitle_filename, ""); // don't set a mimetype, as we don't want it to append a new extension
}
else {
- if( MyDebug.LOG )
- Log.d(TAG, "last_video_file_saf: " + last_video_file_saf);
- String subtitle_filename = storageUtils.getFileName(last_video_file_saf);
- subtitle_filename = getSubtitleFilename(subtitle_filename);
- Uri subtitle_uri = storageUtils.createOutputFileSAF(subtitle_filename, ""); // don't set a mimetype, as we don't want it to append a new extension
- ParcelFileDescriptor pfd_saf = getContext().getContentResolver().openFileDescriptor(subtitle_uri, "w");
- writer = new FileWriter(pfd_saf.getFileDescriptor());
+ Uri folder = Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ?
+ MediaStore.Video.Media.getContentUri(MediaStore.VOLUME_EXTERNAL_PRIMARY) :
+ MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
+ ContentValues contentValues = new ContentValues();
+ contentValues.put(MediaStore.Video.Media.DISPLAY_NAME, subtitle_filename);
+ // set mime type - it's unclear if .SRT files have an official mime type, but (a) we must set a mime type otherwise
+ // resultant files are named "*.srt.mp4", and (b) the mime type must be video/*, otherwise we get exception:
+ // "java.lang.IllegalArgumentException: MIME type text/plain cannot be inserted into content://media/external_primary/video/media; expected MIME type under video/*"
+ // and we need the file to be saved in the same folder (in DCIM/ ) as the video
+ contentValues.put(MediaStore.Images.Media.MIME_TYPE, "video/x-srt");
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ String relative_path = storageUtils.getSaveRelativeFolder();
+ if( MyDebug.LOG )
+ Log.d(TAG, "relative_path: " + relative_path);
+ contentValues.put(MediaStore.Video.Media.RELATIVE_PATH, relative_path);
+ contentValues.put(MediaStore.Video.Media.IS_PENDING, 1);
+ }
+
+ // Note, we catch exceptions specific to insert() here and rethrow as IOException,
+ // rather than catching below, to avoid catching things too broadly.
+ // Catching too broadly could mean we miss genuine problems that should be fixed.
+ try {
+ uri = main_activity.getContentResolver().insert(folder, contentValues);
+ }
+ catch(IllegalArgumentException e) {
+ // can happen for mediastore method if invalid ContentResolver.insert() call
+ if( MyDebug.LOG )
+ Log.e(TAG, "IllegalArgumentException from SubtitleVideoTimerTask inserting to mediastore: " + e.getMessage());
+ e.printStackTrace();
+ throw new IOException();
+ }
+ catch(IllegalStateException e) {
+ if( MyDebug.LOG )
+ Log.e(TAG, "IllegalStateException from SubtitleVideoTimerTask inserting to mediastore: " + e.getMessage());
+ e.printStackTrace();
+ throw new IOException();
+ }
+ if( uri == null ) {
+ throw new IOException();
+ }
}
- }
- if( writer != null ) {
- writer.append(Integer.toString(count));
- writer.append('\n');
- writer.append(subtitle_time_from);
- writer.append(" --> ");
- writer.append(subtitle_time_to);
- writer.append('\n');
- writer.append(subtitles.toString()); // subtitles should include the '\n' at the end
- writer.append('\n'); // additional newline to indicate end of this subtitle
- writer.flush();
- // n.b., we flush rather than closing/reopening the writer each time, as appending doesn't seem to work with storage access framework
+ if( MyDebug.LOG )
+ Log.d(TAG, "uri: " + uri);
+ pfd_saf = getContext().getContentResolver().openFileDescriptor(uri, "w");
+ writer = new FileWriter(pfd_saf.getFileDescriptor());
}
}
- count++;
- }
- catch(IOException e) {
- if( MyDebug.LOG )
- Log.e(TAG, "SubtitleVideoTimerTask failed to create or write");
- e.printStackTrace();
+ if( writer != null ) {
+ writer.append(Integer.toString(count));
+ writer.append('\n');
+ writer.append(subtitle_time_from);
+ writer.append(" --> ");
+ writer.append(subtitle_time_to);
+ writer.append('\n');
+ writer.append(subtitles.toString()); // subtitles should include the '\n' at the end
+ writer.append('\n'); // additional newline to indicate end of this subtitle
+ writer.flush();
+ // n.b., we flush rather than closing/reopening the writer each time, as appending doesn't seem to work with storage access framework
+ }
}
+ count++;
+ }
+ catch(IOException e) {
if( MyDebug.LOG )
- Log.d(TAG, "SubtitleVideoTimerTask exit");
+ Log.e(TAG, "SubtitleVideoTimerTask failed to create or write");
+ e.printStackTrace();
}
+ if( MyDebug.LOG )
+ Log.d(TAG, "SubtitleVideoTimerTask exit");
+ }
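Each pass of run() appends one SubRip (.srt) cue: an index, a time range, the subtitle text (already newline-terminated), and a blank separator line. A minimal plain-Java sketch of that layout; formatSrtTime and formatCue are illustrative helpers for this note, not Open Camera methods:

```java
// Sketch of the SubRip cue layout written by the writer.append() sequence above.
public class SrtCue {
    // "HH:MM:SS,mmm" as used in .srt time ranges
    static String formatSrtTime(long ms) {
        long h = ms / 3600000;
        long m = (ms / 60000) % 60;
        long s = (ms / 1000) % 60;
        long f = ms % 1000;
        return String.format("%02d:%02d:%02d,%03d", h, m, s, f);
    }

    // mirrors the append sequence: index, time range, text (which already
    // ends in '\n'), then an additional newline to terminate the cue
    static String formatCue(int count, long from_ms, long to_ms, String subtitles) {
        return count + "\n" + formatSrtTime(from_ms) + " --> " + formatSrtTime(to_ms) + "\n"
                + subtitles + "\n";
    }
}
```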
- public boolean cancel() {
- if( MyDebug.LOG )
- Log.d(TAG, "SubtitleVideoTimerTask cancel");
- synchronized( this ) {
- if( writer != null ) {
- if( MyDebug.LOG )
- Log.d(TAG, "close writer");
- try {
- writer.close();
- }
- catch(IOException e) {
- e.printStackTrace();
- }
- writer = null;
+ public boolean cancel() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "SubtitleVideoTimerTask cancel");
+ synchronized( this ) {
+ if( writer != null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "close writer");
+ try {
+ writer.close();
+ }
+ catch(IOException e) {
+ e.printStackTrace();
+ }
+ writer = null;
+ }
+ if( pfd_saf != null ) {
+ try {
+ pfd_saf.close();
+ }
+ catch(IOException e) {
+ e.printStackTrace();
+ }
+ pfd_saf = null;
+ }
+ if( video_method == VideoMethod.MEDIASTORE ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ ContentValues contentValues = new ContentValues();
+ contentValues.put(MediaStore.Video.Media.IS_PENDING, 0);
+ main_activity.getContentResolver().update(uri, contentValues, null, null);
}
}
- return super.cancel();
}
+ return super.cancel();
}
- subtitleVideoTimer.schedule(subtitleVideoTimerTask = new SubtitleVideoTimerTask(), 0, 1000);
}
+ subtitleVideoTimer.schedule(subtitleVideoTimerTask = new SubtitleVideoTimerTask(), 0, 1000);
+ }
+
+ @Override
+ public void startedVideo() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "startedVideo()");
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N ) {
+ if( !( main_activity.getMainUI().inImmersiveMode() && main_activity.usingKitKatImmersiveModeEverything() ) ) {
+ View pauseVideoButton = main_activity.findViewById(R.id.pause_video);
+ pauseVideoButton.setVisibility(View.VISIBLE);
+ }
+ main_activity.getMainUI().setPauseVideoContentDescription();
+ }
+ if( main_activity.getPreview().supportsPhotoVideoRecording() && this.usePhotoVideoRecording() ) {
+ if( !( main_activity.getMainUI().inImmersiveMode() && main_activity.usingKitKatImmersiveModeEverything() ) ) {
+ View takePhotoVideoButton = main_activity.findViewById(R.id.take_photo_when_video_recording);
+ takePhotoVideoButton.setVisibility(View.VISIBLE);
+ }
+ }
+ if( main_activity.getMainUI().isExposureUIOpen() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "need to update exposure UI for start video recording");
+ // need to update the exposure UI when starting/stopping video recording, to remove/add
+ // ability to switch between auto and manual
+ main_activity.getMainUI().setupExposureUI();
+ }
+ final VideoMethod video_method = this.createOutputVideoMethod();
+ boolean dategeo_subtitles = getVideoSubtitlePref(video_method).equals("preference_video_subtitle_yes");
+ if( dategeo_subtitles && video_method != ApplicationInterface.VideoMethod.URI ) {
+ startVideoSubtitlesTask(video_method);
+ }
+
+ ImageButton view = main_activity.findViewById(R.id.take_photo);
+ view.setImageResource(R.drawable.ic_camera_video_recording);
}
@Override
@@ -2112,7 +2354,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
}
@Override
- public void stoppedVideo(final int video_method, final Uri uri, final String filename) {
+ public void stoppedVideo(final VideoMethod video_method, final Uri uri, final String filename) {
if( MyDebug.LOG ) {
Log.d(TAG, "stoppedVideo");
Log.d(TAG, "video_method " + video_method);
@@ -2137,13 +2379,13 @@ public class MyApplicationInterface extends BasicApplicationInterface {
subtitleVideoTimerTask = null;
}
+ completeVideo(video_method, uri);
boolean done = broadcastVideo(video_method, uri, filename);
if( MyDebug.LOG )
Log.d(TAG, "done? " + done);
- String action = main_activity.getIntent().getAction();
- if( MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
- if( done && video_method == VIDEOMETHOD_FILE ) {
+ if( isVideoCaptureIntent() ) {
+ if( done && video_method == VideoMethod.FILE ) {
// do nothing here - we end the activity from storageUtils.broadcastFile after the file has been scanned, as it seems caller apps seem to prefer the content:// Uri rather than one based on a File
}
else {
@@ -2152,9 +2394,9 @@ public class MyApplicationInterface extends BasicApplicationInterface {
Intent output = null;
if( done ) {
// may need to pass back the Uri we saved to, if the calling application didn't specify a Uri
- // set note above for VIDEOMETHOD_FILE
- // n.b., currently this code is not used, as we always switch to VIDEOMETHOD_FILE if the calling application didn't specify a Uri, but I've left this here for possible future behaviour
- if( video_method == VIDEOMETHOD_SAF ) {
+ // see note above for VideoMethod.FILE
+ // n.b., currently this code is not used, as we always switch to VideoMethod.FILE if the calling application didn't specify a Uri, but I've left this here for possible future behaviour
+ if( video_method == VideoMethod.SAF || video_method == VideoMethod.MEDIASTORE ) {
output = new Intent();
output.setData(uri);
if( MyDebug.LOG )
@@ -2169,14 +2411,15 @@ public class MyApplicationInterface extends BasicApplicationInterface {
// create thumbnail
long debug_time = System.currentTimeMillis();
Bitmap thumbnail = null;
+ ParcelFileDescriptor pfd_saf = null; // keep a reference to this as long as retriever, to avoid risk of pfd_saf being garbage collected
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
- if( video_method == VIDEOMETHOD_FILE ) {
+ if( video_method == VideoMethod.FILE ) {
File file = new File(filename);
retriever.setDataSource(file.getPath());
}
else {
- ParcelFileDescriptor pfd_saf = getContext().getContentResolver().openFileDescriptor(uri, "r");
+ pfd_saf = getContext().getContentResolver().openFileDescriptor(uri, "r");
retriever.setDataSource(pfd_saf.getFileDescriptor());
}
thumbnail = retriever.getFrameAtTime(-1);
@@ -2193,6 +2436,14 @@ public class MyApplicationInterface extends BasicApplicationInterface {
catch(RuntimeException ex) {
// ignore
}
+ try {
+ if( pfd_saf != null ) {
+ pfd_saf.close();
+ }
+ }
+ catch(IOException e) {
+ e.printStackTrace();
+ }
}
if( thumbnail != null ) {
CircleImageView galleryButton = main_activity.findViewById(R.id.gallery);
@@ -2226,17 +2477,43 @@ public class MyApplicationInterface extends BasicApplicationInterface {
}
@Override
- public void restartedVideo(final int video_method, final Uri uri, final String filename) {
+ public void restartedVideo(final VideoMethod video_method, final Uri uri, final String filename) {
if( MyDebug.LOG ) {
Log.d(TAG, "restartedVideo");
Log.d(TAG, "video_method " + video_method);
Log.d(TAG, "uri " + uri);
Log.d(TAG, "filename " + filename);
}
+ completeVideo(video_method, uri);
broadcastVideo(video_method, uri, filename);
+
+ // also need to restart subtitles file
+ if( subtitleVideoTimerTask != null ) {
+ subtitleVideoTimerTask.cancel();
+ subtitleVideoTimerTask = null;
+
+ // No need to check if option for subtitles is set, if we were already saving subtitles.
+ // Assume that video_method is unchanged between old and new video file when restarting.
+ startVideoSubtitlesTask(video_method);
+ }
+ }
+
+ /** Called when we've finished recording to a video file, to do any necessary cleanup for the
+ * file.
+ */
+ private void completeVideo(final VideoMethod video_method, final Uri uri) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "completeVideo");
+ if( video_method == VideoMethod.MEDIASTORE ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ ContentValues contentValues = new ContentValues();
+ contentValues.put(MediaStore.Video.Media.IS_PENDING, 0);
+ main_activity.getContentResolver().update(uri, contentValues, null, null);
+ }
+ }
}
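completeVideo() clears the MediaStore IS_PENDING flag that was set when the entry was inserted: the file stays hidden from other apps while it is still being written, then becomes visible once the flag is cleared. A plain-Java sketch of that lifecycle; the class and method names here are illustrative stand-ins, not Android APIs:

```java
// Models the IS_PENDING lifecycle for a single MediaStore entry.
public class PendingDemo {
    private boolean inserted;
    private int is_pending;

    // analogous to ContentResolver.insert() with IS_PENDING = 1
    void insertPending() {
        inserted = true;
        is_pending = 1;
    }

    // analogous to the ContentResolver.update() that clears IS_PENDING
    void complete() {
        is_pending = 0;
    }

    // other apps only see the entry once it exists and is no longer pending
    boolean isVisibleToOtherApps() {
        return inserted && is_pending == 0;
    }
}
```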
- private boolean broadcastVideo(final int video_method, final Uri uri, final String filename) {
+ private boolean broadcastVideo(final VideoMethod video_method, final Uri uri, final String filename) {
if( MyDebug.LOG ) {
Log.d(TAG, "broadcastVideo");
Log.d(TAG, "video_method " + video_method);
@@ -2244,7 +2521,24 @@ public class MyApplicationInterface extends BasicApplicationInterface {
Log.d(TAG, "filename " + filename);
}
boolean done = false;
- if( video_method == VIDEOMETHOD_FILE ) {
+ // clear just in case we're unable to update this - don't want an out of date cached uri
+ storageUtils.clearLastMediaScanned();
+ if( video_method == VideoMethod.MEDIASTORE ) {
+ // no need to broadcast when using mediastore
+
+ if( uri != null ) {
+ // in theory this is pointless, as announceUri no longer does anything on Android 7+,
+ // and mediastore method is only used on Android 10+, but keep this just in case
+ // announceUri does something in future
+ storageUtils.announceUri(uri, false, true);
+
+ // we also want to save the uri - we can use the media uri directly, rather than having to scan it
+ storageUtils.setLastMediaScanned(uri, false);
+
+ done = true;
+ }
+ }
+ else if( video_method == VideoMethod.FILE ) {
if( filename != null ) {
File file = new File(filename);
storageUtils.broadcastFile(file, false, true, true);
@@ -2254,10 +2548,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
else {
if( uri != null ) {
// see note in onPictureTaken() for where we call broadcastFile for SAF photos
- File real_file = storageUtils.broadcastUri(uri, false, true, true);
- if( real_file != null ) {
- main_activity.test_last_saved_image = real_file.getAbsolutePath();
- }
+ storageUtils.broadcastUri(uri, false, true, true, false);
done = true;
}
}
@@ -2266,22 +2557,41 @@ public class MyApplicationInterface extends BasicApplicationInterface {
if( MyDebug.LOG )
Log.d(TAG, "test_n_videos_scanned is now: " + test_n_videos_scanned);
}
+
+ if( video_method == VideoMethod.MEDIASTORE && isVideoCaptureIntent() ) {
+ finishVideoIntent(uri);
+ }
return done;
}
+ /** For use when called from a video capture intent. This returns the supplied uri to the
+ * caller, and finishes the activity.
+ */
+ void finishVideoIntent(Uri uri) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "finishVideoIntent: " + uri);
+ Intent output = new Intent();
+ output.setData(uri);
+ main_activity.setResult(Activity.RESULT_OK, output);
+ main_activity.finish();
+ }
+
@Override
- public void deleteUnusedVideo(final int video_method, final Uri uri, final String filename) {
+ public void deleteUnusedVideo(final VideoMethod video_method, final Uri uri, final String filename) {
if( MyDebug.LOG ) {
Log.d(TAG, "deleteUnusedVideo");
Log.d(TAG, "video_method " + video_method);
Log.d(TAG, "uri " + uri);
Log.d(TAG, "filename " + filename);
}
- if( video_method == VIDEOMETHOD_FILE ) {
- trashImage(false, uri, filename, false);
+ if( video_method == VideoMethod.FILE ) {
+ trashImage(LastImagesType.FILE, uri, filename, false);
}
- else if( video_method == VIDEOMETHOD_SAF ) {
- trashImage(true, uri, filename, false);
+ else if( video_method == VideoMethod.SAF ) {
+ trashImage(LastImagesType.SAF, uri, filename, false);
+ }
+ else if( video_method == VideoMethod.MEDIASTORE ) {
+ trashImage(LastImagesType.MEDIASTORE, uri, filename, false);
}
// else can't delete Uri
}
@@ -2357,10 +2667,6 @@ public class MyApplicationInterface extends BasicApplicationInterface {
error_message = getContext().getResources().getString(R.string.failed_to_record_video);
}
main_activity.getPreview().showToast(null, error_message);
- ImageButton view = main_activity.findViewById(R.id.take_photo);
- view.setImageResource(R.drawable.take_video_selector);
- view.setContentDescription( getContext().getResources().getString(R.string.start_video) );
- view.setTag(R.drawable.take_video_selector); // for testing
}
@Override
@@ -2383,11 +2689,9 @@ public class MyApplicationInterface extends BasicApplicationInterface {
@Override
public void onFailedCreateVideoFileError() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onFailedCreateVideoFileError");
main_activity.getPreview().showToast(null, R.string.failed_to_save_video);
- ImageButton view = main_activity.findViewById(R.id.take_photo);
- view.setImageResource(R.drawable.take_video_selector);
- view.setContentDescription( getContext().getResources().getString(R.string.start_video) );
- view.setTag(R.drawable.take_video_selector); // for testing
}
@Override
@@ -2530,6 +2834,13 @@ public class MyApplicationInterface extends BasicApplicationInterface {
main_activity.getMainUI().setSeekbarZoom(new_zoom);
}
+ @Override
+ public void requestTakePhoto() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "requestTakePhoto");
+ main_activity.takePicture(false);
+ }
+
/** Switch to the first available camera that is front or back facing as desired.
* @param front_facing Whether to switch to a front or back facing camera.
*/
@@ -2700,6 +3011,8 @@ public class MyApplicationInterface extends BasicApplicationInterface {
public boolean needsStoragePermission() {
if( MyDebug.LOG )
Log.d(TAG, "needsStoragePermission");
+ if( MainActivity.useScopedStorage() )
+ return false; // no longer need storage permission with scoped storage - and shouldn't request it either
return true;
}
@@ -2753,7 +3066,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
// aperture is reset when switching camera, but not when application is paused or switching between photo/video etc
this.aperture = aperture_default;
}
- this.zoom_factor = 0;
+ this.zoom_factor = -1;
}
@Override
@@ -2847,8 +3160,18 @@ public class MyApplicationInterface extends BasicApplicationInterface {
paint.setAlpha(255);
}
paint.setColor(foreground);
+ if( shadow == Shadow.SHADOW_OUTLINE ) {
+ //noinspection PointlessArithmeticExpression
+ float shadow_radius = (1.0f * scale + 0.5f); // convert dps to pixels
+ shadow_radius = Math.max(shadow_radius, 1.0f);
+ paint.setShadowLayer(shadow_radius, 0.0f, 0.0f, background);
+ }
canvas.drawText(text, location_x, location_y, paint);
if( shadow == Shadow.SHADOW_OUTLINE ) {
+ paint.clearShadowLayer(); // set back to default
+ }
+ /*if( shadow == Shadow.SHADOW_OUTLINE ) {
+ // old method (instead of setting shadow layer) - doesn't work correctly on Android 12!
paint.setColor(background);
paint.setStyle(Paint.Style.STROKE);
float current_stroke_width = paint.getStrokeWidth();
@@ -2856,7 +3179,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
canvas.drawText(text, location_x, location_y, paint);
paint.setStyle(Paint.Style.FILL); // set back to default
paint.setStrokeWidth(current_stroke_width); // reset
- }
+ }*/
return text_bounds.bottom - text_bounds.top;
}
@@ -2882,6 +3205,17 @@ public class MyApplicationInterface extends BasicApplicationInterface {
return image_capture_intent;
}
+ boolean isVideoCaptureIntent() {
+ boolean video_capture_intent = false;
+ String action = main_activity.getIntent().getAction();
+ if( MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "from video capture intent");
+ video_capture_intent = true;
+ }
+ return video_capture_intent;
+ }
+
/** Whether the photos will be part of a burst, even if we're receiving via the non-burst callbacks.
*/
private boolean forceSuffix(PhotoMode photo_mode) {
@@ -2921,6 +3255,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
}
boolean using_camera2 = main_activity.getPreview().usingCamera2API();
+ boolean using_camera_extensions = isCameraExtensionPref();
ImageSaver.Request.ImageFormat image_format = getImageFormatPref();
boolean store_ypr = sharedPreferences.getBoolean(PreferenceKeys.AddYPRToComments, false) &&
main_activity.getPreview().hasLevelAngle() &&
@@ -2953,7 +3288,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
String preference_stamp_dateformat = this.getStampDateFormatPref();
String preference_stamp_timeformat = this.getStampTimeFormatPref();
String preference_stamp_gpsformat = this.getStampGPSFormatPref();
- String preference_stamp_geo_address = this.getStampGeoAddressPref();
+ //String preference_stamp_geo_address = this.getStampGeoAddressPref();
String preference_units_distance = this.getUnitsDistancePref();
boolean panorama_crop = sharedPreferences.getString(PreferenceKeys.PanoramaCropPreferenceKey, "preference_panorama_crop_on").equals("preference_panorama_crop_on");
boolean store_location = getGeotaggingPref() && getLocation() != null;
@@ -3055,7 +3390,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
photo_mode == PhotoMode.NoiseReduction ? ImageSaver.Request.ProcessType.AVERAGE : ImageSaver.Request.ProcessType.PANORAMA,
save_base,
image_capture_intent, image_capture_intent_uri,
- using_camera2,
+ using_camera2, using_camera_extensions,
image_format, image_quality,
do_auto_stabilise, level_angle, photo_mode == PhotoMode.Panorama,
is_front_facing,
@@ -3064,7 +3399,9 @@ public class MyApplicationInterface extends BasicApplicationInterface {
iso,
exposure_time,
zoom_factor,
- preference_stamp, preference_textstamp, font_size, color, pref_style, preference_stamp_dateformat, preference_stamp_timeformat, preference_stamp_gpsformat, preference_stamp_geo_address, preference_units_distance,
+ preference_stamp, preference_textstamp, font_size, color, pref_style, preference_stamp_dateformat, preference_stamp_timeformat, preference_stamp_gpsformat,
+ //preference_stamp_geo_address,
+ preference_units_distance,
panorama_crop,
store_location, location, store_geo_direction, geo_direction,
pitch_angle, store_ypr,
@@ -3097,7 +3434,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
force_suffix ? (n_capture_images-1) : 0,
save_expo, images,
image_capture_intent, image_capture_intent_uri,
- using_camera2,
+ using_camera2, using_camera_extensions,
image_format, image_quality,
do_auto_stabilise, level_angle,
is_front_facing,
@@ -3107,7 +3444,9 @@ public class MyApplicationInterface extends BasicApplicationInterface {
iso,
exposure_time,
zoom_factor,
- preference_stamp, preference_textstamp, font_size, color, pref_style, preference_stamp_dateformat, preference_stamp_timeformat, preference_stamp_gpsformat, preference_stamp_geo_address, preference_units_distance,
+ preference_stamp, preference_textstamp, font_size, color, pref_style, preference_stamp_dateformat, preference_stamp_timeformat, preference_stamp_gpsformat,
+ //preference_stamp_geo_address,
+ preference_units_distance,
false, // panorama doesn't use this codepath
store_location, location, store_geo_direction, geo_direction,
pitch_angle, store_ypr,
@@ -3231,7 +3570,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
Log.d(TAG, "addLastImage: " + file);
Log.d(TAG, "share?: " + share);
}
- last_images_saf = false;
+ last_images_type = LastImagesType.FILE;
LastImage last_image = new LastImage(file.getAbsolutePath(), share);
last_images.add(last_image);
}
@@ -3241,7 +3580,17 @@ public class MyApplicationInterface extends BasicApplicationInterface {
Log.d(TAG, "addLastImageSAF: " + uri);
Log.d(TAG, "share?: " + share);
}
- last_images_saf = true;
+ last_images_type = LastImagesType.SAF;
+ LastImage last_image = new LastImage(uri, share);
+ last_images.add(last_image);
+ }
+
+ void addLastImageMediaStore(Uri uri, boolean share) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "addLastImageMediaStore: " + uri);
+ Log.d(TAG, "share?: " + share);
+ }
+ last_images_type = LastImagesType.MEDIASTORE;
LastImage last_image = new LastImage(uri, share);
last_images.add(last_image);
}
@@ -3249,7 +3598,7 @@ public class MyApplicationInterface extends BasicApplicationInterface {
void clearLastImages() {
if( MyDebug.LOG )
Log.d(TAG, "clearLastImages");
- last_images_saf = false;
+ last_images_type = LastImagesType.FILE;
last_images.clear();
drawPreview.clearLastImage();
}
@@ -3292,13 +3641,13 @@ public class MyApplicationInterface extends BasicApplicationInterface {
}
@TargetApi(Build.VERSION_CODES.LOLLIPOP)
- private void trashImage(boolean image_saf, Uri image_uri, String image_name, boolean from_user) {
+ private void trashImage(LastImagesType image_type, Uri image_uri, String image_name, boolean from_user) {
if( MyDebug.LOG )
Log.d(TAG, "trashImage");
Preview preview = main_activity.getPreview();
- if( image_saf && image_uri != null ) {
+ if( image_type == LastImagesType.SAF && image_uri != null ) {
if( MyDebug.LOG )
- Log.d(TAG, "Delete: " + image_uri);
+ Log.d(TAG, "Delete SAF: " + image_uri);
File file = storageUtils.getFileFromDocumentUriSAF(image_uri, false); // need to get file before deleting it, as fileFromDocumentUriSAF may depend on the file still existing
try {
if( !DocumentsContract.deleteDocument(main_activity.getContentResolver(), image_uri) ) {
@@ -3324,6 +3673,11 @@ public class MyApplicationInterface extends BasicApplicationInterface {
e.printStackTrace();
}
}
+ else if( image_type == LastImagesType.MEDIASTORE && image_uri != null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "Delete MediaStore: " + image_uri);
+ main_activity.getContentResolver().delete(image_uri, null, null);
+ }
else if( image_name != null ) {
if( MyDebug.LOG )
Log.d(TAG, "Delete: " + image_name);
@@ -3344,12 +3698,12 @@ public class MyApplicationInterface extends BasicApplicationInterface {
void trashLastImage() {
if( MyDebug.LOG )
- Log.d(TAG, "trashImage");
+ Log.d(TAG, "trashLastImage");
Preview preview = main_activity.getPreview();
if( preview.isPreviewPaused() ) {
for(int i=0;i= Build.VERSION_CODES.M )
+ flags = flags | PendingIntent.FLAG_IMMUTABLE; // needed for targeting Android 12+, but fine to set for all versions from Android 6 onwards
+ pendingIntent = PendingIntent.getActivity(context, 0, intent, flags);
}
RemoteViews remote_views = new RemoteViews(context.getPackageName(), R.layout.widget_layout);
diff --git a/app/src/main/java/net/sourceforge/opencamera/MyWidgetProviderTakePhoto.java b/app/src/main/java/net/sourceforge/opencamera/MyWidgetProviderTakePhoto.java
index 76bfe385c2fea54fab72c4be19ab588f533729a8..cfea899b50a43b4243bc0537dafa4b33b9af5100 100644
--- a/app/src/main/java/net/sourceforge/opencamera/MyWidgetProviderTakePhoto.java
+++ b/app/src/main/java/net/sourceforge/opencamera/MyWidgetProviderTakePhoto.java
@@ -5,6 +5,7 @@ import android.appwidget.AppWidgetManager;
import android.appwidget.AppWidgetProvider;
import android.content.Context;
import android.content.Intent;
+import android.os.Build;
import android.util.Log;
import android.widget.RemoteViews;
@@ -28,7 +29,11 @@ public class MyWidgetProviderTakePhoto extends AppWidgetProvider {
Log.d(TAG, "appWidgetId: " + appWidgetId);
Intent intent = new Intent(context, TakePhoto.class);
- PendingIntent pendingIntent = PendingIntent.getActivity(context, 0, intent, 0);
+
+ int flags = PendingIntent.FLAG_UPDATE_CURRENT;
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.M )
+ flags = flags | PendingIntent.FLAG_IMMUTABLE; // needed for targeting Android 12+, but fine to set for all versions from Android 6 onwards
+ PendingIntent pendingIntent = PendingIntent.getActivity(context, 0, intent, flags);
RemoteViews remote_views = new RemoteViews(context.getPackageName(), R.layout.widget_layout_take_photo);
remote_views.setOnClickPendingIntent(R.id.widget_take_photo, pendingIntent);
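The flag handling in both widget providers is a plain bitwise OR of PendingIntent flags. A stand-alone illustration; the numeric values below mirror the documented android.app.PendingIntent constants but should be treated as local stand-ins:

```java
// Combining PendingIntent-style flags as bit masks.
public class FlagDemo {
    static final int FLAG_UPDATE_CURRENT = 1 << 27;
    static final int FLAG_IMMUTABLE = 1 << 26;

    // add FLAG_IMMUTABLE only when the platform supports it (Android 6+)
    static int buildFlags(boolean supports_immutable) {
        int flags = FLAG_UPDATE_CURRENT;
        if (supports_immutable)
            flags = flags | FLAG_IMMUTABLE;
        return flags;
    }
}
```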
diff --git a/app/src/main/java/net/sourceforge/opencamera/PanoramaProcessor.java b/app/src/main/java/net/sourceforge/opencamera/PanoramaProcessor.java
index 378d977d60579d4aee4cf71a75132982136bbfe9..ed2f74045256ffb602a566a80636e2a8fbdde31f 100644
--- a/app/src/main/java/net/sourceforge/opencamera/PanoramaProcessor.java
+++ b/app/src/main/java/net/sourceforge/opencamera/PanoramaProcessor.java
@@ -218,7 +218,7 @@ public class PanoramaProcessor {
* RGBA_8888.
*/
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
- private List<Allocation> createLaplacianPyramid(ScriptC_pyramid_blending script, Bitmap bitmap, int n_levels, @SuppressWarnings("unused") String name) {
+ private List<Allocation> createLaplacianPyramid(ScriptC_pyramid_blending script, Bitmap bitmap, int n_levels, String name) {
if( MyDebug.LOG )
Log.d(TAG, "createLaplacianPyramid");
long time_s = 0;
@@ -497,7 +497,6 @@ public class PanoramaProcessor {
}
}
- @SuppressWarnings("unused")
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
private void saveAllocation(String name, Allocation allocation) {
Bitmap bitmap;
@@ -780,7 +779,7 @@ public class PanoramaProcessor {
}
}
- private static void computeDistancesBetweenMatches(List<FeatureMatch> matches, int st_indx, int nd_indx, int feature_descriptor_radius, @SuppressWarnings("unused") List<Bitmap> bitmaps, int [] pixels0, int [] pixels1) {
+ private static void computeDistancesBetweenMatches(List<FeatureMatch> matches, int st_indx, int nd_indx, int feature_descriptor_radius, List<Bitmap> bitmaps, int [] pixels0, int [] pixels1) {
final int wid = 2*feature_descriptor_radius+1;
final int wid2 = wid*wid;
for(int indx=st_indx;indx bitmaps, long time_s) {
List<HistogramInfo> histogramInfos = new ArrayList<>();
@@ -3107,6 +3104,12 @@ public class PanoramaProcessor {
Log.d(TAG, "crop_y1: " + crop_y1);
Log.d(TAG, "panorama_height: " + panorama_height);
}
+
+ if( panorama_height <= 0 ) {
+ // can happen if the transforms are such that we move off top or bottom of screen! Better to fail gracefully
+ Log.e(TAG, "crop caused panorama height to become -ve: " + panorama_height);
+ throw new PanoramaProcessorException(PanoramaProcessorException.FAILED_TO_CROP);
+ }
}
Bitmap panorama = Bitmap.createBitmap(panorama_width, panorama_height, Bitmap.Config.ARGB_8888);
diff --git a/app/src/main/java/net/sourceforge/opencamera/PanoramaProcessorException.java b/app/src/main/java/net/sourceforge/opencamera/PanoramaProcessorException.java
index c5dfc178a7b5207b36c0791f6e86bb58955401f6..b0787693a324e50ffab22f1524c1c5f52b6e79e4 100644
--- a/app/src/main/java/net/sourceforge/opencamera/PanoramaProcessorException.java
+++ b/app/src/main/java/net/sourceforge/opencamera/PanoramaProcessorException.java
@@ -6,6 +6,7 @@ package net.sourceforge.opencamera;
public class PanoramaProcessorException extends Exception {
final static public int INVALID_N_IMAGES = 0; // the supplied number of images is not supported
final static public int UNEQUAL_SIZES = 1; // images not of the same resolution
+ final static public int FAILED_TO_CROP = 2; // failed to crop
final private int code;
diff --git a/app/src/main/java/net/sourceforge/opencamera/PermissionHandler.java b/app/src/main/java/net/sourceforge/opencamera/PermissionHandler.java
index ef82c09881d5babbae5277beba29c113f21789a8..a7a1ee36255678c62326719ff9bf6ff3e43685e3 100644
--- a/app/src/main/java/net/sourceforge/opencamera/PermissionHandler.java
+++ b/app/src/main/java/net/sourceforge/opencamera/PermissionHandler.java
@@ -26,6 +26,26 @@ public class PermissionHandler {
final private static int MY_PERMISSIONS_REQUEST_RECORD_AUDIO = 2;
final private static int MY_PERMISSIONS_REQUEST_LOCATION = 3;
+ private boolean camera_denied; // whether the user requested to deny a camera permission
+ private long camera_denied_time_ms; // if denied, the time when this occurred
+ private boolean storage_denied; // whether the user requested to deny the storage permission
+ private long storage_denied_time_ms; // if denied, the time when this occurred
+ private boolean audio_denied; // whether the user requested to deny the record-audio permission
+ private long audio_denied_time_ms; // if denied, the time when this occurred
+ private boolean location_denied; // whether the user requested to deny the location permission
+ private long location_denied_time_ms; // if denied, the time when this occurred
+ // In some cases there can be a problem if the user denies a permission: we then get an onResume()
+ // (since the application goes into the background when showing the system UI to request permission), at which
+ // point we try to request the permission again! This would happen for camera and storage permissions.
+ // Whilst that isn't necessarily wrong, there would also be a problem if the user says
+ // "Don't ask again": we'd get stuck in a loop repeatedly asking the OS for permission (and it
+ // repeatedly being automatically denied), causing the UI to become sluggish.
+ // So instead we only ask again if we're not within deny_delay_ms of the user denying that
+ // permission.
+ // The time shouldn't be too long, as the user might restart and then not be asked again for camera
+ // or storage permission.
+ final private static long deny_delay_ms = 1000;
+
PermissionHandler(MainActivity main_activity) {
this.main_activity = main_activity;
}
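The deny-debounce that the denied/denied_time_ms fields implement can be modeled outside Android with an explicit clock. A minimal sketch under those assumptions; the names are illustrative:

```java
// Debounce for permission re-requests: after a denial, don't ask the OS
// again until DENY_DELAY_MS has elapsed.
public class DenyDebounce {
    static final long DENY_DELAY_MS = 1000;
    private boolean denied;
    private long denied_time_ms;

    // record that the user denied the permission at time now_ms
    void onDenied(long now_ms) {
        denied = true;
        denied_time_ms = now_ms;
    }

    // true if it's OK to show the system permission dialog again
    boolean mayRequest(long now_ms) {
        return !denied || now_ms >= denied_time_ms + DENY_DELAY_MS;
    }
}
```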
@@ -103,6 +123,11 @@ public class PermissionHandler {
Log.e(TAG, "shouldn't be requesting permissions for pre-Android M!");
return;
}
+ else if( camera_denied && System.currentTimeMillis() < camera_denied_time_ms + deny_delay_ms ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "too soon since user last denied permission");
+ return;
+ }
if( ActivityCompat.shouldShowRequestPermissionRationale(main_activity, Manifest.permission.CAMERA) ) {
// Show an explanation to the user *asynchronously* -- don't block
@@ -126,6 +151,16 @@ public class PermissionHandler {
Log.e(TAG, "shouldn't be requesting permissions for pre-Android M!");
return;
}
+ else if( MainActivity.useScopedStorage() ) {
+ if( MyDebug.LOG )
+ Log.e(TAG, "shouldn't be requesting permissions for scoped storage!");
+ return;
+ }
+ else if( storage_denied && System.currentTimeMillis() < storage_denied_time_ms + deny_delay_ms ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "too soon since user last denied permission");
+ return;
+ }
if( ActivityCompat.shouldShowRequestPermissionRationale(main_activity, Manifest.permission.WRITE_EXTERNAL_STORAGE) ) {
// Show an explanation to the user *asynchronously* -- don't block
@@ -149,6 +184,11 @@ public class PermissionHandler {
Log.e(TAG, "shouldn't be requesting permissions for pre-Android M!");
return;
}
+ else if( audio_denied && System.currentTimeMillis() < audio_denied_time_ms + deny_delay_ms ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "too soon since user last denied permission");
+ return;
+ }
if( ActivityCompat.shouldShowRequestPermissionRationale(main_activity, Manifest.permission.RECORD_AUDIO) ) {
// Show an explanation to the user *asynchronously* -- don't block
@@ -172,6 +212,11 @@ public class PermissionHandler {
Log.e(TAG, "shouldn't be requesting permissions for pre-Android M!");
return;
}
+ else if( location_denied && System.currentTimeMillis() < location_denied_time_ms + deny_delay_ms ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "too soon since user last denied permission");
+ return;
+ }
if( ActivityCompat.shouldShowRequestPermissionRationale(main_activity, Manifest.permission.ACCESS_FINE_LOCATION) ||
ActivityCompat.shouldShowRequestPermissionRationale(main_activity, Manifest.permission.ACCESS_COARSE_LOCATION) ) {
@@ -212,6 +257,8 @@ public class PermissionHandler {
else {
if( MyDebug.LOG )
Log.d(TAG, "camera permission denied");
+ camera_denied = true;
+ camera_denied_time_ms = System.currentTimeMillis();
// permission denied, boo! Disable the
// functionality that depends on this permission.
// Open Camera doesn't need to do anything: the camera will remain closed
@@ -232,6 +279,8 @@ public class PermissionHandler {
else {
if( MyDebug.LOG )
Log.d(TAG, "storage permission denied");
+ storage_denied = true;
+ storage_denied_time_ms = System.currentTimeMillis();
// permission denied, boo! Disable the
// functionality that depends on this permission.
// Open Camera doesn't need to do anything: the camera will remain closed
@@ -252,6 +301,8 @@ public class PermissionHandler {
else {
if( MyDebug.LOG )
Log.d(TAG, "record audio permission denied");
+ audio_denied = true;
+ audio_denied_time_ms = System.currentTimeMillis();
// permission denied, boo! Disable the
// functionality that depends on this permission.
// no need to do anything
@@ -262,17 +313,28 @@ public class PermissionHandler {
case MY_PERMISSIONS_REQUEST_LOCATION:
{
// If request is cancelled, the result arrays are empty.
- if( grantResults.length > 0
+ if( grantResults.length == 2 && (grantResults[0] == PackageManager.PERMISSION_GRANTED || grantResults[1] == PackageManager.PERMISSION_GRANTED) ) {
+ // On Android 12, users can choose to grant only approximate location. This means
+ // one of the permissions will be denied, but as long as one location permission
+ // is granted, we can still go ahead and use location.
+ // Otherwise we'd have the problem that if the user selects approximate location, we
+ // end up turning the location option back off.
+ if( MyDebug.LOG )
+ Log.d(TAG, "location permission granted [1]");
+ main_activity.initLocation();
+ }
+ else if( grantResults.length > 0
&& grantResults[0] == PackageManager.PERMISSION_GRANTED ) {
- // permission was granted, yay! Do the
- // contacts-related task you need to do.
+ // in theory this code path is now redundant, but keep here just in case
if( MyDebug.LOG )
- Log.d(TAG, "location permission granted");
+ Log.d(TAG, "location permission granted [2]");
main_activity.initLocation();
}
else {
if( MyDebug.LOG )
Log.d(TAG, "location permission denied");
+ location_denied = true;
+ location_denied_time_ms = System.currentTimeMillis();
// permission denied, boo! Disable the
// functionality that depends on this permission.
// for location, seems best to turn the option back off
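The deny-delay debounce added across the four request methods above follows one pattern: record the time of the user's denial, and suppress further requests within `deny_delay_ms`. A minimal standalone sketch (class and method names here are illustrative, not the app's actual API):

```java
// Sketch of the deny-delay debounce: after the user denies a permission,
// further requests within DENY_DELAY_MS are suppressed, so an automatic
// retry loop can't repeatedly re-trigger the system permission dialog.
public class DenyDebounce {
    private static final long DENY_DELAY_MS = 1000;

    private boolean denied;
    private long deniedTimeMs;

    /** Call when the user denies the permission. */
    public void onDenied(long nowMs) {
        denied = true;
        deniedTimeMs = nowMs;
    }

    /** Whether it is OK to show the permission request again. */
    public boolean mayRequest(long nowMs) {
        return !(denied && nowMs < deniedTimeMs + DENY_DELAY_MS);
    }

    public static void main(String[] args) {
        DenyDebounce d = new DenyDebounce();
        if (!d.mayRequest(0)) throw new AssertionError();
        d.onDenied(1000);
        if (d.mayRequest(1500)) throw new AssertionError(); // too soon after denial
        if (!d.mayRequest(2001)) throw new AssertionError(); // delay has elapsed
    }
}
```

In the patch, `System.currentTimeMillis()` plays the role of `nowMs`, with one `denied`/`denied_time_ms` pair per permission.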
diff --git a/app/src/main/java/net/sourceforge/opencamera/PreferenceKeys.java b/app/src/main/java/net/sourceforge/opencamera/PreferenceKeys.java
index bb590c8e59e4b44c4f9b4444261468afa7a8af67..ef56fce2a3f4d402f506def284cc03f6ee551b89 100644
--- a/app/src/main/java/net/sourceforge/opencamera/PreferenceKeys.java
+++ b/app/src/main/java/net/sourceforge/opencamera/PreferenceKeys.java
@@ -142,7 +142,7 @@ public class PreferenceKeys {
public static final String StampGPSFormatPreferenceKey = "preference_stamp_gpsformat";
- public static final String StampGeoAddressPreferenceKey = "preference_stamp_geo_address";
+ //public static final String StampGeoAddressPreferenceKey = "preference_stamp_geo_address";
public static final String UnitsDistancePreferenceKey = "preference_units_distance";
@@ -170,6 +170,8 @@ public class PreferenceKeys {
public static final String Camera2FakeFlashPreferenceKey = "preference_camera2_fake_flash";
+ public static final String Camera2DummyCaptureHackPreferenceKey = "preference_camera2_dummy_capture_hack";
+
public static final String Camera2FastBurstPreferenceKey = "preference_camera2_fast_burst";
public static final String Camera2PhotoVideoRecordingPreferenceKey = "preference_camera2_photo_video_recording";
@@ -204,6 +206,10 @@ public class PreferenceKeys {
public static final String SaveLocationSAFPreferenceKey = "preference_save_location_saf";
+ public static final String SaveLocationHistoryBasePreferenceKey = "save_location_history";
+
+ public static final String SaveLocationHistorySAFBasePreferenceKey = "save_location_history_saf";
+
public static final String SavePhotoPrefixPreferenceKey = "preference_save_photo_prefix";
public static final String SaveVideoPrefixPreferenceKey = "preference_save_video_prefix";
@@ -350,5 +356,6 @@ public class PreferenceKeys {
public static final String ShutterSoundPreferenceKey = "preference_shutter_sound";
+ public static final String ImmersiveModePreferenceKey = "preference_immersive_mode";
public static final String AddYPRToComments="preference_comment_ypr";
}
diff --git a/app/src/main/java/net/sourceforge/opencamera/SaveLocationHistory.java b/app/src/main/java/net/sourceforge/opencamera/SaveLocationHistory.java
index 56024ba969e23c4ccdbe42ed46113e2de72af7a4..0ce4172cfd1f0be01ee066b1c24ce421510ea98e 100644
--- a/app/src/main/java/net/sourceforge/opencamera/SaveLocationHistory.java
+++ b/app/src/main/java/net/sourceforge/opencamera/SaveLocationHistory.java
@@ -125,6 +125,21 @@ public class SaveLocationHistory {
return save_location_history.get(index);
}
+ /** Removes a save location entry.
+ * @param index The index to remove.
+ */
+ public void remove(int index) {
+ save_location_history.remove(index);
+ }
+
+ /** Sets a save location entry.
+ * @param index The index to set.
+ * @param element The new entry.
+ */
+ public void set(int index, String element) {
+ save_location_history.set(index, element);
+ }
+
// for testing:
/** Should be used for testing only.
* @param value The value to search the location history for.
diff --git a/app/src/main/java/net/sourceforge/opencamera/SettingsManager.java b/app/src/main/java/net/sourceforge/opencamera/SettingsManager.java
index 22b10d5faad18a49bd018c81ddc2cd5affb097a7..dd6ff0ad5a4b6f27500b25c1325ae308f171c862 100644
--- a/app/src/main/java/net/sourceforge/opencamera/SettingsManager.java
+++ b/app/src/main/java/net/sourceforge/opencamera/SettingsManager.java
@@ -22,7 +22,6 @@ import java.io.OutputStream;
import java.io.StringWriter;
import java.nio.charset.Charset;
import java.util.Map;
-import java.util.Set;
import foundation.e.camera.R;
@@ -218,10 +217,9 @@ public class SettingsManager {
xmlSerializer.startTag(null, doc_tag);
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(main_activity);
- Set set = sharedPreferences.getAll().entrySet();
- for( Object aSet : set) {
- Map.Entry entry = (Map.Entry) aSet;
- String key = (String)entry.getKey();
+ Map<String, ?> map = sharedPreferences.getAll();
+ for( Map.Entry<String, ?> entry : map.entrySet()) {
+ String key = entry.getKey();
Object value = entry.getValue();
if( key != null ) {
String tag_type = null;
@@ -247,9 +245,7 @@ public class SettingsManager {
if( tag_type != null ) {
xmlSerializer.startTag(null, tag_type);
xmlSerializer.attribute(null, "key", key);
- if( value != null ) {
- xmlSerializer.attribute(null, "value", value.toString());
- }
+ xmlSerializer.attribute(null, "value", value.toString());
xmlSerializer.endTag(null, tag_type);
}
}
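The SettingsManager hunk replaces the raw `Set`/cast iteration with typed `Map.Entry` access over `SharedPreferences.getAll()`. A minimal sketch of that iteration pattern, with a plain `Map` standing in for `SharedPreferences` (the `dump` helper is hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Iterating a preference map as Map<String, ?> gives typed keys directly,
// avoiding the raw Set and explicit (Map.Entry)/(String) casts.
public class PrefsDump {
    /** Serialise each key/value pair, skipping null keys or values. */
    static String dump(Map<String, ?> map) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, ?> entry : map.entrySet()) {
            String key = entry.getKey();
            Object value = entry.getValue();
            if (key != null && value != null)
                sb.append(key).append('=').append(value).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, Object> prefs = new LinkedHashMap<>();
        prefs.put("preference_shutter_sound", true);
        prefs.put("preference_save_photo_prefix", "IMG_");
        System.out.print(dump(prefs));
    }
}
```

This also explains why the patched code can drop the `value != null` guard around `value.toString()` only if values are known non-null; the sketch keeps the defensive check.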
diff --git a/app/src/main/java/net/sourceforge/opencamera/SpeechControl.java b/app/src/main/java/net/sourceforge/opencamera/SpeechControl.java
deleted file mode 100644
index adf0346af72812c9fcbb9ee46282def12eb3591d..0000000000000000000000000000000000000000
--- a/app/src/main/java/net/sourceforge/opencamera/SpeechControl.java
+++ /dev/null
@@ -1,295 +0,0 @@
-package net.sourceforge.opencamera;
-
-import android.content.Intent;
-import android.content.SharedPreferences;
-import android.os.Bundle;
-import android.os.Handler;
-import android.preference.PreferenceManager;
-import android.speech.RecognitionListener;
-import android.speech.RecognizerIntent;
-import android.speech.SpeechRecognizer;
-import android.util.Log;
-import android.view.View;
-
-import java.util.ArrayList;
-import java.util.Locale;
-
-import foundation.e.camera.R;
-
-/** Manages speech recognition for remote control.
- */
-class SpeechControl {
- private static final String TAG = "SpeechControl";
-
- private final MainActivity main_activity;
-
- private SpeechRecognizer speechRecognizer;
- private boolean speechRecognizerIsStarted;
-
- SpeechControl(final MainActivity main_activity) {
- this.main_activity = main_activity;
- }
-
- void startSpeechRecognizerIntent() {
- if( MyDebug.LOG )
- Log.d(TAG, "startSpeechRecognizerIntent");
- if( speechRecognizer != null ) {
- Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
- intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "en_US"); // since we listen for "cheese", ensure this works even for devices with different language settings
- speechRecognizer.startListening(intent);
- }
- }
-
- void speechRecognizerStarted() {
- if( MyDebug.LOG )
- Log.d(TAG, "speechRecognizerStarted");
- main_activity.getMainUI().audioControlStarted();
- speechRecognizerIsStarted = true;
- }
-
- private void speechRecognizerStopped() {
- if( MyDebug.LOG )
- Log.d(TAG, "speechRecognizerStopped");
- main_activity.getMainUI().audioControlStopped();
- speechRecognizerIsStarted = false;
- }
-
- void initSpeechRecognizer() {
- if( MyDebug.LOG )
- Log.d(TAG, "initSpeechRecognizer");
- // in theory we could create the speech recognizer always (hopefully it shouldn't use battery when not listening?), though to be safe, we only do this when the option is enabled (e.g., just in case this doesn't work on some devices!)
- SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(main_activity);
- boolean want_speech_recognizer = sharedPreferences.getString(PreferenceKeys.AudioControlPreferenceKey, "none").equals("voice");
- if( speechRecognizer == null && want_speech_recognizer ) {
- if( MyDebug.LOG )
- Log.d(TAG, "create new speechRecognizer");
- speechRecognizer = SpeechRecognizer.createSpeechRecognizer(main_activity);
- if( speechRecognizer != null ) {
- speechRecognizerIsStarted = false;
- speechRecognizer.setRecognitionListener(new RecognitionListener() {
- private void restart() {
- if( MyDebug.LOG )
- Log.d(TAG, "RecognitionListener: restart");
- Handler handler = new Handler();
- handler.postDelayed(new Runnable() {
- public void run() {
- startSpeechRecognizerIntent();
- }
- }, 250);
-
- /*freeSpeechRecognizer();
- Handler handler = new Handler();
- handler.postDelayed(new Runnable() {
- public void run() {
- initSpeechRecognizer();
- startSpeechRecognizerIntent();
- speechRecognizerIsStarted = true;
- }
- }, 500);*/
- }
-
- @Override
- public void onBeginningOfSpeech() {
- if( MyDebug.LOG )
- Log.d(TAG, "RecognitionListener: onBeginningOfSpeech");
- if( !speechRecognizerIsStarted ) {
- if( MyDebug.LOG )
- Log.d(TAG, "...but speech recognition already stopped");
- //noinspection UnnecessaryReturnStatement
- return;
- }
- }
-
- @Override
- public void onBufferReceived(byte[] buffer) {
- if( MyDebug.LOG )
- Log.d(TAG, "RecognitionListener: onBufferReceived");
- if( !speechRecognizerIsStarted ) {
- if( MyDebug.LOG )
- Log.d(TAG, "...but speech recognition already stopped");
- //noinspection UnnecessaryReturnStatement
- return;
- }
- }
-
- @Override
- public void onEndOfSpeech() {
- if( MyDebug.LOG )
- Log.d(TAG, "RecognitionListener: onEndOfSpeech");
- if( !speechRecognizerIsStarted ) {
- if( MyDebug.LOG )
- Log.d(TAG, "...but speech recognition already stopped");
- return;
- }
- //speechRecognizerStopped();
- restart();
- }
-
- @Override
- public void onError(int error) {
- if( MyDebug.LOG )
- Log.d(TAG, "RecognitionListener: onError: " + error);
- if( !speechRecognizerIsStarted ) {
- if( MyDebug.LOG )
- Log.d(TAG, "...but speech recognition already stopped");
- return;
- }
- if( error != SpeechRecognizer.ERROR_NO_MATCH ) {
- // we sometime receive ERROR_NO_MATCH straight after listening starts
- // it seems that the end is signalled either by ERROR_SPEECH_TIMEOUT or onEndOfSpeech()
- //speechRecognizerStopped();
- /*if( error == SpeechRecognizer.ERROR_RECOGNIZER_BUSY ) {
- if( MyDebug.LOG )
- Log.d(TAG, "RecognitionListener: ERROR_RECOGNIZER_BUSY");
- freeSpeechRecognizer();
-
- Handler handler = new Handler();
- handler.postDelayed(new Runnable() {
- public void run() {
- initSpeechRecognizer();
- startSpeechRecognizerIntent();
- speechRecognizerIsStarted = true;
- }
- }, 500);
- }
- else*/ {
- restart();
- }
- }
- }
-
- @Override
- public void onEvent(int eventType, Bundle params) {
- if( MyDebug.LOG )
- Log.d(TAG, "RecognitionListener: onEvent");
- if( !speechRecognizerIsStarted ) {
- if( MyDebug.LOG )
- Log.d(TAG, "...but speech recognition already stopped");
- //noinspection UnnecessaryReturnStatement
- return;
- }
- }
-
- @Override
- public void onPartialResults(Bundle partialResults) {
- if( MyDebug.LOG )
- Log.d(TAG, "RecognitionListener: onPartialResults");
- if( !speechRecognizerIsStarted ) {
- if( MyDebug.LOG )
- Log.d(TAG, "...but speech recognition already stopped");
- //noinspection UnnecessaryReturnStatement
- return;
- }
- }
-
- @Override
- public void onReadyForSpeech(Bundle params) {
- if( MyDebug.LOG )
- Log.d(TAG, "RecognitionListener: onReadyForSpeech");
- if( !speechRecognizerIsStarted ) {
- if( MyDebug.LOG )
- Log.d(TAG, "...but speech recognition already stopped");
- //noinspection UnnecessaryReturnStatement
- return;
- }
- }
-
- public void onResults(Bundle results) {
- if( MyDebug.LOG )
- Log.d(TAG, "RecognitionListener: onResults");
- if( !speechRecognizerIsStarted ) {
- if( MyDebug.LOG )
- Log.d(TAG, "...but speech recognition already stopped");
- return;
- }
- ArrayList<String> list = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
- boolean found = false;
- final String trigger = "cheese";
- //String debug_toast = "";
- for(int i=0;list != null && i<list.size();i++) {
- String text = list.get(i);
- /*if( i > 0 )
- debug_toast += "\n";
- debug_toast += text + " : " + results.getFloatArray(SpeechRecognizer.CONFIDENCE_SCORES)[i];*/
- if( text.toLowerCase(Locale.US).contains(trigger) ) {
- found = true;
- }
- }
- //preview.showToast(null, debug_toast); // debug only!
- if( found ) {
- if( MyDebug.LOG )
- Log.d(TAG, "audio trigger from speech recognition");
- main_activity.audioTrigger();
- }
- else if( list != null && list.size() > 0 ) {
- String toast = list.get(0) + "?";
- if( MyDebug.LOG )
- Log.d(TAG, "unrecognised: " + toast);
- main_activity.getPreview().showToast(main_activity.getAudioControlToast(), toast);
- }
- }
-
- @Override
- public void onRmsChanged(float rmsdB) {
- }
- });
-
- if( !main_activity.getMainUI().inImmersiveMode() ) {
- View speechRecognizerButton = main_activity.findViewById(R.id.audio_control);
- speechRecognizerButton.setVisibility(View.VISIBLE);
- }
- }
- }
- else if( speechRecognizer != null && !want_speech_recognizer ) {
- if( MyDebug.LOG )
- Log.d(TAG, "stop existing SpeechRecognizer");
- stopSpeechRecognizer();
- }
- }
-
- private void freeSpeechRecognizer() {
- if( MyDebug.LOG )
- Log.d(TAG, "freeSpeechRecognizer");
- speechRecognizer.cancel();
- try {
- speechRecognizer.destroy();
- }
- catch(IllegalArgumentException e) {
- // reported from Google Play - unclear why this happens, but might as well catch
- Log.e(TAG, "exception destroying speechRecognizer");
- e.printStackTrace();
- }
- speechRecognizer = null;
- }
-
- void stopSpeechRecognizer() {
- if( MyDebug.LOG )
- Log.d(TAG, "stopSpeechRecognizer");
- if( speechRecognizer != null ) {
- speechRecognizerStopped();
- View speechRecognizerButton = main_activity.findViewById(R.id.audio_control);
- speechRecognizerButton.setVisibility(View.GONE);
- freeSpeechRecognizer();
- }
- }
-
- boolean isStarted() {
- return speechRecognizerIsStarted;
- }
-
- void stopListening() {
- speechRecognizer.stopListening();
- this.speechRecognizerStopped();
- }
-
- /** Whether the speech recognition has been set up (via initSpeechRecognizer()).
- */
- boolean hasSpeechRecognition() {
- return speechRecognizer != null;
- }
-}
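The deleted SpeechControl's core logic in `onResults()` was a case-insensitive scan of the recognizer's candidate strings for a trigger word. In isolation, that matching step can be sketched as follows (the class name is illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Locale;

// Sketch of the trigger matching from the removed SpeechControl: scan the
// recognizer's candidates for the trigger word, lower-cased with a fixed
// locale so the comparison behaves the same regardless of device language.
public class TriggerMatch {
    static final String TRIGGER = "cheese";

    static boolean containsTrigger(List<String> results) {
        for (int i = 0; results != null && i < results.size(); i++) {
            String text = results.get(i);
            if (text.toLowerCase(Locale.US).contains(TRIGGER))
                return true;
        }
        return false;
    }

    public static void main(String[] args) {
        if (!containsTrigger(Arrays.asList("say Cheese please"))) throw new AssertionError();
        if (containsTrigger(Arrays.asList("hello"))) throw new AssertionError();
        if (containsTrigger(null)) throw new AssertionError(); // cancelled recognition
    }
}
```

The `null` check mirrors `getStringArrayList()` possibly returning null when recognition is cancelled.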
diff --git a/app/src/main/java/net/sourceforge/opencamera/StorageUtils.java b/app/src/main/java/net/sourceforge/opencamera/StorageUtils.java
index 8abf1adbd38238b0c0df604fc8a6f39ac3742e27..64f8bf9563faa15776187bd25a6e90d9fe2a86fd 100644
--- a/app/src/main/java/net/sourceforge/opencamera/StorageUtils.java
+++ b/app/src/main/java/net/sourceforge/opencamera/StorageUtils.java
@@ -36,6 +36,7 @@ import android.provider.MediaStore.Video.VideoColumns;
import android.provider.OpenableColumns;
import androidx.annotation.RequiresApi;
import androidx.core.content.ContextCompat;
+
import android.system.Os;
import android.system.StructStatVfs;
import android.util.Log;
@@ -54,8 +55,9 @@ public class StorageUtils {
private final Context context;
private final MyApplicationInterface applicationInterface;
private Uri last_media_scanned;
+ private boolean last_media_scanned_is_raw;
- private final static File base_folder = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM);
+ private final static String RELATIVE_FOLDER_BASE = Environment.DIRECTORY_DCIM;
// for testing:
public volatile boolean failed_to_scan;
@@ -69,8 +71,24 @@ public class StorageUtils {
return last_media_scanned;
}
+ boolean getLastMediaScannedIsRaw() {
+ return last_media_scanned_is_raw;
+ }
+
void clearLastMediaScanned() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "clearLastMediaScanned");
last_media_scanned = null;
+ last_media_scanned_is_raw = false;
+ }
+
+ void setLastMediaScanned(Uri uri, boolean is_raw) {
+ last_media_scanned = uri;
+ last_media_scanned_is_raw = is_raw;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "set last_media_scanned to " + last_media_scanned);
+ Log.d(TAG, " last_media_scanned_is_raw: " + last_media_scanned_is_raw);
+ }
}
/** Sends the intents to announce the new file to other Android applications. E.g., cloud storage applications like
@@ -97,6 +115,7 @@ public class StorageUtils {
if( MyDebug.LOG ) // this code only used for debugging/logging
{
+ @SuppressLint("InlinedApi") // complains this constant only available on API 29 (even though it was available on older versions, but looks like it was moved?)
String[] CONTENT_PROJECTION = { Images.Media.DATA, Images.Media.DISPLAY_NAME, Images.Media.MIME_TYPE, Images.Media.SIZE, Images.Media.DATE_TAKEN, Images.Media.DATE_ADDED };
Cursor c = context.getContentResolver().query(uri, CONTENT_PROJECTION, null, null, null);
if( c == null ) {
@@ -108,11 +127,12 @@ public class StorageUtils {
Log.e(TAG, "Couldn't resolve given uri [2]: " + uri);
}
else {
- String file_path = c.getString(c.getColumnIndex(Images.Media.DATA));
- String file_name = c.getString(c.getColumnIndex(Images.Media.DISPLAY_NAME));
- String mime_type = c.getString(c.getColumnIndex(Images.Media.MIME_TYPE));
- long date_taken = c.getLong(c.getColumnIndex(Images.Media.DATE_TAKEN));
- long date_added = c.getLong(c.getColumnIndex(Images.Media.DATE_ADDED));
+ String file_path = c.getString(c.getColumnIndexOrThrow(Images.Media.DATA));
+ String file_name = c.getString(c.getColumnIndexOrThrow(Images.Media.DISPLAY_NAME));
+ String mime_type = c.getString(c.getColumnIndexOrThrow(Images.Media.MIME_TYPE));
+ @SuppressLint("InlinedApi") // complains this constant only available on API 29 (even though it was available on older versions, but looks like it was moved?)
+ long date_taken = c.getLong(c.getColumnIndexOrThrow(Images.Media.DATE_TAKEN));
+ long date_added = c.getLong(c.getColumnIndexOrThrow(Images.Media.DATE_ADDED));
Log.d(TAG, "file_path: " + file_path);
Log.d(TAG, "file_name: " + file_name);
Log.d(TAG, "mime_type: " + mime_type);
@@ -239,10 +259,14 @@ public class StorageUtils {
Log.d(TAG, "-> uri=" + uri);
}
if( set_last_scanned ) {
+ boolean is_raw = filenameIsRaw(file.getName());
+ setLastMediaScanned(uri, is_raw);
last_media_scanned = uri;
if (is_new_video) {
- Media media = getLatestMedia(is_new_video);
- last_media_scanned = (media != null) ? media.uri : uri;
+ Media media = getLatestMedia(UriType.MEDIASTORE_VIDEOS);
+ if (media != null) {
+ setLastMediaScanned(media.uri, is_raw);
+ }
}
if( MyDebug.LOG )
Log.d(TAG, "set last_media_scanned to " + last_media_scanned);
@@ -250,17 +274,15 @@ public class StorageUtils {
announceUri(uri, is_new_picture, is_new_video);
applicationInterface.scannedFile(file, uri);
- // it seems caller apps seem to prefer the content:// Uri rather than one based on a File
+ // If called from video intent, if not using scoped-storage, we'll have saved using File API (even if user preference is SAF), see
+ // MyApplicationInterface.createOutputVideoMethod().
+ // It seems caller apps prefer the content:// Uri rather than one based on a File
// update for Android 7: seems that passing file uris is now restricted anyway, see https://code.google.com/p/android/issues/detail?id=203555
+ // So we pass the uri back to the caller here.
Activity activity = (Activity)context;
String action = activity.getIntent().getAction();
- if( MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
- if( MyDebug.LOG )
- Log.d(TAG, "from video capture intent");
- Intent output = new Intent();
- output.setData(uri);
- activity.setResult(Activity.RESULT_OK, output);
- activity.finish();
+ if( !MainActivity.useScopedStorage() && MediaStore.ACTION_VIDEO_CAPTURE.equals(action) ) {
+ applicationInterface.finishVideoIntent(uri);
}
}
}
@@ -270,9 +292,22 @@ public class StorageUtils {
/** Wrapper for broadcastFile, when we only have a Uri (e.g., for SAF)
*/
- public File broadcastUri(final Uri uri, final boolean is_new_picture, final boolean is_new_video, final boolean set_last_scanned) {
+ public void broadcastUri(final Uri uri, final boolean is_new_picture, final boolean is_new_video, final boolean set_last_scanned, final boolean image_capture_intent) {
if( MyDebug.LOG )
Log.d(TAG, "broadcastUri: " + uri);
+ /* We still need to broadcastFile for SAF for various reasons:
+ 1. To call storageUtils.announceUri() to broadcast NEW_PICTURE etc.
+ Whilst in theory we could do this directly, it seems external apps that use such broadcasts typically
+ won't know what to do with a SAF-based Uri (e.g., Owncloud crashes!) so better to broadcast the Uri
+ corresponding to the real file, if it exists.
+ 2. Whilst the new file seems to be known by external apps such as Gallery without having to call media
+ scanner, I've had reports this doesn't happen when saving to external SD cards. So better to explicitly
+ scan.
+ 3. If set_last_scanned==true, it means we get the media uri which can be used to set the thumbnail uri
+ (see setLastMediaScanned()). This is particularly important when using SAF with scoped storage, as
+ getting the latest media via SAF APIs is (if not cached) very slow! N.B., most gallery apps need a
+ mediastore uri, not the SAF uri.
+ */
File real_file = getFileFromDocumentUriSAF(uri, false);
if( MyDebug.LOG )
Log.d(TAG, "real_file: " + real_file);
@@ -282,17 +317,16 @@ public class StorageUtils {
//Uri media_uri = broadcastFileRaw(real_file, current_date, location);
//announceUri(media_uri, is_new_picture, is_new_video);
broadcastFile(real_file, is_new_picture, is_new_video, set_last_scanned);
- return real_file;
}
- else {
+ else if( !image_capture_intent ) {
if( MyDebug.LOG )
Log.d(TAG, "announce SAF uri");
+ // shouldn't do this for an image capture intent - e.g., causes crash when calling from Google Keep
announceUri(uri, is_new_picture, is_new_video);
}
- return null;
}
- boolean isUsingSAF() {
+ public boolean isUsingSAF() {
// check Android version just to be safe
if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP ) {
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(context);
@@ -316,7 +350,7 @@ public class StorageUtils {
}
// only valid if isUsingSAF()
- private Uri getTreeUriSAF() {
+ public Uri getTreeUriSAF() {
String folder_name = getSaveLocationSAF();
return Uri.parse(folder_name);
}
@@ -325,8 +359,23 @@ public class StorageUtils {
return new File(context.getExternalFilesDir(null), "backups");
}
- // valid if whether or not isUsingSAF()
- // but note that if isUsingSAF(), this may return null - it can't be assumed that there is a File corresponding to the SAF Uri
+ /** Valid whether or not isUsingSAF().
+ * Returns the absolute path (in File format) of the image save folder.
+ * Only use this when e.g. a human-readable string is needed for the UI.
+ * This should not be used to create a File - instead, use getImageFolder().
+ * Note that if isUsingSAF(), this may return null - it can't be assumed that there is a
+ * File corresponding to the SAF Uri.
+ */
+ @TargetApi(Build.VERSION_CODES.LOLLIPOP)
+ public String getImageFolderPath() {
+ File file = getImageFolder();
+ return file == null ? null : file.getAbsolutePath();
+ }
+
+ /** Valid whether or not isUsingSAF().
+ * But note that if isUsingSAF(), this may return null - it can't be assumed that there is a
+ * File corresponding to the SAF Uri.
+ */
@TargetApi(Build.VERSION_CODES.LOLLIPOP)
File getImageFolder() {
File file;
@@ -343,10 +392,35 @@ public class StorageUtils {
return file;
}
+ // only valid if !isUsingSAF()
+ // returns a form for use with RELATIVE_PATH (scoped storage)
+ String getSaveRelativeFolder() {
+ String folder_name = getSaveLocation();
+ return getSaveRelativeFolder(folder_name);
+ }
+
+ // only valid if !isUsingSAF()
+ // returns a form for use with RELATIVE_PATH (scoped storage)
+ private static String getSaveRelativeFolder(String folder_name) {
+ if( folder_name.length() > 0 && folder_name.lastIndexOf('/') == folder_name.length()-1 ) {
+ // ignore final '/' character
+ folder_name = folder_name.substring(0, folder_name.length()-1);
+ }
+ return RELATIVE_FOLDER_BASE + File.separator + folder_name;
+ }
+
public static File getBaseFolder() {
+ final File base_folder = Environment.getExternalStoragePublicDirectory(RELATIVE_FOLDER_BASE);
return base_folder;
}
+ /** Whether the save photo/video location is in a form that represents a full path, or a
+ * sub-folder in DCIM/.
+ */
+ static boolean saveFolderIsFull(String folder_name) {
+ return folder_name.startsWith("/");
+ }
+
// only valid if !isUsingSAF()
private static File getImageFolder(String folder_name) {
File file;
@@ -354,8 +428,7 @@ public class StorageUtils {
// ignore final '/' character
folder_name = folder_name.substring(0, folder_name.length()-1);
}
- //if( folder_name.contains("/") ) {
- if( folder_name.startsWith("/") ) {
+ if( saveFolderIsFull(folder_name) ) {
file = new File(folder_name);
}
else {
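The new `getSaveRelativeFolder()`/`saveFolderIsFull()` helpers in this hunk encode a simple rule: a save location starting with `/` is a full path, anything else is a sub-folder under DCIM/, and for MediaStore's `RELATIVE_PATH` the base is prepended after stripping a trailing `/`. A standalone sketch of that string logic, with the literal `"DCIM"` standing in for `Environment.DIRECTORY_DCIM`:

```java
import java.io.File;

// Sketch of the scoped-storage relative-path helpers: decide whether a save
// location is an absolute path, and build the RELATIVE_PATH form otherwise.
public class SaveFolders {
    static final String RELATIVE_FOLDER_BASE = "DCIM"; // Environment.DIRECTORY_DCIM on Android

    /** Whether the save location is a full path rather than a DCIM/ sub-folder. */
    static boolean saveFolderIsFull(String folderName) {
        return folderName.startsWith("/");
    }

    /** Form suitable for MediaStore's RELATIVE_PATH (scoped storage). */
    static String getSaveRelativeFolder(String folderName) {
        if (folderName.length() > 0 && folderName.lastIndexOf('/') == folderName.length() - 1) {
            // ignore final '/' character
            folderName = folderName.substring(0, folderName.length() - 1);
        }
        return RELATIVE_FOLDER_BASE + File.separator + folderName;
    }

    public static void main(String[] args) {
        if (saveFolderIsFull("OpenCamera")) throw new AssertionError();
        if (!saveFolderIsFull("/storage/emulated/0/DCIM")) throw new AssertionError();
        if (!getSaveRelativeFolder("OpenCamera/").equals("DCIM" + File.separator + "OpenCamera"))
            throw new AssertionError();
    }
}
```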
@@ -364,6 +437,17 @@ public class StorageUtils {
return file;
}
+ /** Only valid if isUsingSAF()
+ * Returns the absolute path (in File format) of the SAF folder.
+ * Only use this when e.g. a human-readable string is needed for the UI.
+ * This should not be used to create a File - instead, use getFileFromDocumentUriSAF().
+ */
+ @TargetApi(Build.VERSION_CODES.LOLLIPOP)
+ public String getFilePathFromDocumentUriSAF(Uri uri, boolean is_folder) {
+ File file = getFileFromDocumentUriSAF(uri, is_folder);
+ return file == null ? null : file.getAbsolutePath();
+ }
+
/** Only valid if isUsingSAF()
* This function should only be used as a last resort - we shouldn't generally assume that a Uri represents an actual File, or that
* the File can be obtained anyway.
@@ -371,6 +455,8 @@ public class StorageUtils {
* See:
http://stackoverflow.com/questions/21605493/storage-access-framework-does-not-update-mediascanner-mtp
http://stackoverflow.com/questions/20067508/get-real-path-from-uri-android-kitkat-new-storage-access-framework/
+ Note that when using Android Q's scoped storage, the returned File will be inaccessible. However we still sometimes call this,
+ e.g., to scan with mediascanner or get a human readable string for the path.
Also note that this will return null for media store Uris with Android Q's scoped storage: https://developer.android.com/preview/privacy/scoped-storage
"The DATA column is redacted for each file in the media store."
*/
@@ -492,7 +578,7 @@ public class StorageUtils {
}
private String getDataColumn(Uri uri, String selection, String [] selectionArgs) {
- final String column = "_data";
+ final String column = MediaStore.Images.ImageColumns.DATA;
final String[] projection = {
column
};
@@ -524,7 +610,7 @@ public class StorageUtils {
* See https://developer.android.com/guide/topics/providers/document-provider.html and
* http://stackoverflow.com/questions/5568874/how-to-extract-the-file-name-from-uri-returned-from-intent-action-get-content .
*/
- String getFileName(Uri uri) {
+ public String getFileName(Uri uri) {
if( MyDebug.LOG ) {
Log.d(TAG, "getFileName: " + uri);
Log.d(TAG, "uri has path: " + uri.getPath());
@@ -565,7 +651,7 @@ public class StorageUtils {
return result;
}
- private String createMediaFilename(int type, String suffix, int count, String extension, Date current_date) {
+ String createMediaFilename(int type, String suffix, int count, String extension, Date current_date) {
String index = "";
if( count > 0 ) {
index = "_" + count; // try to find a unique filename
@@ -718,40 +804,56 @@ public class StorageUtils {
}
}
+ /** Return the mime type corresponding to the supplied extension. Supports images only, not video.
+ */
+ public String getImageMimeType(String extension) {
+ String mimeType;
+ switch (extension) {
+ case "dng":
+ mimeType = "image/dng";
+ //mimeType = "image/x-adobe-dng";
+ break;
+ case "webp":
+ mimeType = "image/webp";
+ break;
+ case "png":
+ mimeType = "image/png";
+ break;
+ default:
+ mimeType = "image/jpeg";
+ break;
+ }
+ return mimeType;
+ }
+
+ /** Return the mime type corresponding to the supplied extension. Supports video only, not images.
+ */
+ String getVideoMimeType(String extension) {
+ String mimeType;
+ switch( extension ) {
+ case "3gp":
+ mimeType = "video/3gpp";
+ break;
+ case "webm":
+ mimeType = "video/webm";
+ break;
+ default:
+ mimeType = "video/mp4";
+ break;
+ }
+ return mimeType;
+ }
+
// only valid if isUsingSAF()
@TargetApi(Build.VERSION_CODES.LOLLIPOP)
Uri createOutputMediaFileSAF(int type, String suffix, String extension, Date current_date) throws IOException {
String mimeType;
switch (type) {
case MEDIA_TYPE_IMAGE:
- switch (extension) {
- case "dng":
- mimeType = "image/dng";
- //mimeType = "image/x-adobe-dng";
- break;
- case "webp":
- mimeType = "image/webp";
- break;
- case "png":
- mimeType = "image/png";
- break;
- default:
- mimeType = "image/jpeg";
- break;
- }
+ mimeType = getImageMimeType(extension);
break;
case MEDIA_TYPE_VIDEO:
- switch( extension ) {
- case "3gp":
- mimeType = "video/3gpp";
- break;
- case "webm":
- mimeType = "video/webm";
- break;
- default:
- mimeType = "video/mp4";
- break;
- }
+ mimeType = getVideoMimeType(extension);
break;
case MEDIA_TYPE_PREFS:
case MEDIA_TYPE_GYRO_INFO:
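The two extracted MIME helpers above are plain extension-to-type tables with a fallback default. A self-contained sketch of the same mapping (method names shortened for illustration; the mappings themselves follow the patch, including the non-standard `image/dng` choice the original comments out against `image/x-adobe-dng`):

```java
// Sketch of the extension -> MIME type tables factored out in the patch.
public class MimeTypes {
    /** Image extension to MIME type; anything unrecognised is treated as JPEG. */
    static String imageMimeType(String extension) {
        switch (extension) {
            case "dng":  return "image/dng";
            case "webp": return "image/webp";
            case "png":  return "image/png";
            default:     return "image/jpeg";
        }
    }

    /** Video extension to MIME type; anything unrecognised is treated as MP4. */
    static String videoMimeType(String extension) {
        switch (extension) {
            case "3gp":  return "video/3gpp";
            case "webm": return "video/webm";
            default:     return "video/mp4";
        }
    }

    public static void main(String[] args) {
        System.out.println(imageMimeType("dng"));  // image/dng
        System.out.println(videoMimeType("avi"));  // falls back to video/mp4
    }
}
```

Factoring these out lets `createOutputMediaFileSAF()` and the new scoped-storage paths share one table instead of duplicating the switch.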
@@ -769,14 +871,16 @@ public class StorageUtils {
}
static class Media {
- final long id;
+ final boolean mediastore; // whether uri is from mediastore
+ final long id; // for mediastore==true only
final boolean video;
final Uri uri;
final long date;
- final int orientation;
+ final int orientation; // for mediastore==true, video==false only
final String filename; // this should correspond to DISPLAY_NAME (so available with scoped storage) - so this includes file extension, but not full path
- Media(long id, boolean video, Uri uri, long date, int orientation, String filename) {
+ Media(boolean mediastore, long id, boolean video, Uri uri, long date, int orientation, String filename) {
+ this.mediastore = mediastore;
this.id = id;
this.video = video;
this.uri = uri;
@@ -784,14 +888,52 @@ public class StorageUtils {
this.orientation = orientation;
this.filename = filename;
}
+
+ /** Returns a mediastore uri. If this Media object was not created by a mediastore uri, then
+ * this will try to convert using MediaStore.getMediaUri(), but if this fails the function
+ * will return null.
+ */
+ Uri getMediaStoreUri(Context context) {
+ if( this.mediastore )
+ return this.uri;
+ else {
+ try {
+                    // should only have allowed mediastore==false when using scoped storage
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ return MediaStore.getMediaUri(context, this.uri);
+ }
+ }
+ catch(Exception e) {
+ e.printStackTrace();
+ }
+ return null;
+ }
+ }
+ }
+
+ static boolean filenameIsRaw(String filename) {
+ return filename.toLowerCase(Locale.US).endsWith(".dng");
+ }
+
+ private static String filenameWithoutExtension(String filename) {
+ String filename_without_ext = filename.toLowerCase(Locale.US);
+ if( filename_without_ext.indexOf(".") > 0 )
+ filename_without_ext = filename_without_ext.substring(0, filename_without_ext.lastIndexOf("."));
+ return filename_without_ext;
+ }
+
+ private enum UriType {
+ MEDIASTORE_IMAGES,
+ MEDIASTORE_VIDEOS
}
- private Media getLatestMediaCore(Uri baseUri, String bucket_id, boolean video) {
+    @SuppressLint("InlinedApi") // lint complains that MediaColumns constants are only available on API 29 (even though they were available on older versions, but it looks like they were moved?); for some reason it doesn't allow putting this suppression at the actual lines?!
+ private Media getLatestMediaCore(Uri baseUri, String bucket_id, UriType uri_type) {
if( MyDebug.LOG ) {
Log.d(TAG, "getLatestMediaCore");
Log.d(TAG, "baseUri: " + baseUri);
Log.d(TAG, "bucket_id: " + bucket_id);
- Log.d(TAG, "video: " + video);
+ Log.d(TAG, "uri_type: " + uri_type);
}
Media media = null;
@@ -801,32 +943,62 @@ public class StorageUtils {
final int column_name_c = 3; // filename (without path), including extension
final int column_orientation_c = 4; // for images only*/
final int column_name_c = 2; // filename (without path), including extension
- final int column_orientation_c = 3; // for images only
- String [] projection = video ?
- new String[] {VideoColumns._ID, VideoColumns.DATE_TAKEN, VideoColumns.DISPLAY_NAME} :
- new String[] {ImageColumns._ID, ImageColumns.DATE_TAKEN, ImageColumns.DISPLAY_NAME, ImageColumns.ORIENTATION};
+ final int column_orientation_c = 3; // for mediastore images only
+ String [] projection;
+ switch( uri_type ) {
+ case MEDIASTORE_IMAGES:
+ projection = new String[] {ImageColumns._ID, ImageColumns.DATE_TAKEN, ImageColumns.DISPLAY_NAME, ImageColumns.ORIENTATION};
+ break;
+ case MEDIASTORE_VIDEOS:
+ projection = new String[] {VideoColumns._ID, VideoColumns.DATE_TAKEN, VideoColumns.DISPLAY_NAME};
+ break;
+ default:
+ throw new RuntimeException("unknown uri_type: " + uri_type);
+ }
// for images, we need to search for JPEG/etc and RAW, to support RAW only mode (even if we're not currently in that mode, it may be that previously the user did take photos in RAW only mode)
+ // if updating this code for supported mime types, remember to also update getLatestMediaSAF()
/*String selection = video ? "" : ImageColumns.MIME_TYPE + "='image/jpeg' OR " +
ImageColumns.MIME_TYPE + "='image/webp' OR " +
ImageColumns.MIME_TYPE + "='image/png' OR " +
ImageColumns.MIME_TYPE + "='image/x-adobe-dng'";*/
String selection = "";
- if( bucket_id != null )
- selection = (video ? VideoColumns.BUCKET_ID : ImageColumns.BUCKET_ID) + " = " + bucket_id;
- if( !video ) {
- boolean and = selection.length() > 0;
- if( and )
- selection += " AND ( ";
- selection += ImageColumns.MIME_TYPE + "='image/jpeg' OR " +
- ImageColumns.MIME_TYPE + "='image/webp' OR " +
- ImageColumns.MIME_TYPE + "='image/png' OR " +
- ImageColumns.MIME_TYPE + "='image/x-adobe-dng'";
- if( and )
- selection += " )";
+ switch( uri_type ) {
+ case MEDIASTORE_IMAGES:
+ {
+ if( bucket_id != null )
+ selection = ImageColumns.BUCKET_ID + " = " + bucket_id;
+ boolean and = selection.length() > 0;
+ if( and )
+ selection += " AND ( ";
+ selection += ImageColumns.MIME_TYPE + "='image/jpeg' OR " +
+ ImageColumns.MIME_TYPE + "='image/webp' OR " +
+ ImageColumns.MIME_TYPE + "='image/png' OR " +
+ ImageColumns.MIME_TYPE + "='image/x-adobe-dng'";
+ if( and )
+ selection += " )";
+ break;
+ }
+ case MEDIASTORE_VIDEOS:
+ if( bucket_id != null )
+ selection = VideoColumns.BUCKET_ID + " = " + bucket_id;
+ break;
+ default:
+ throw new RuntimeException("unknown uri_type: " + uri_type);
}
if( MyDebug.LOG )
Log.d(TAG, "selection: " + selection);
- String order = video ? VideoColumns.DATE_TAKEN + " DESC," + VideoColumns._ID + " DESC" : ImageColumns.DATE_TAKEN + " DESC," + ImageColumns._ID + " DESC";
+ String order;
+ switch( uri_type ) {
+ case MEDIASTORE_IMAGES:
+ order = ImageColumns.DATE_TAKEN + " DESC," + ImageColumns._ID + " DESC";
+ break;
+ case MEDIASTORE_VIDEOS:
+ //noinspection DuplicateBranchesInSwitch
+ order = VideoColumns.DATE_TAKEN + " DESC," + VideoColumns._ID + " DESC";
+ break;
+ default:
+ throw new RuntimeException("unknown uri_type: " + uri_type);
+ }
Cursor cursor = null;
// we know we only want the most recent image - however we may need to scan forward if we find a RAW, to see if there's
@@ -890,14 +1062,12 @@ public class StorageUtils {
Log.d(TAG, "filename: " + filename);
}
// in theory now that we use DISPLAY_NAME instead of DATA (for path), this should always be non-null, but check just in case
- if( filename != null && filename.toLowerCase(Locale.US).endsWith(".dng") ) {
+ if( filename != null && filenameIsRaw(filename) ) {
if( MyDebug.LOG )
Log.d(TAG, "try to find a non-RAW version of the DNG");
int dng_pos = cursor.getPosition();
boolean found_non_raw = false;
- String filename_without_ext = filename.toLowerCase(Locale.US);
- if( filename_without_ext.indexOf(".") > 0 )
- filename_without_ext = filename_without_ext.substring(0, filename_without_ext.lastIndexOf("."));
+ String filename_without_ext = filenameWithoutExtension(filename);
if( MyDebug.LOG )
Log.d(TAG, "filename_without_ext: " + filename_without_ext);
while( cursor.moveToNext() ) {
@@ -909,9 +1079,7 @@ public class StorageUtils {
Log.d(TAG, "done scanning, couldn't find filename");
break;
}
- String next_filename_without_ext = next_filename.toLowerCase(Locale.US);
- if( next_filename_without_ext.indexOf(".") > 0 )
- next_filename_without_ext = next_filename_without_ext.substring(0, next_filename_without_ext.lastIndexOf("."));
+ String next_filename_without_ext = filenameWithoutExtension(next_filename);
if( MyDebug.LOG )
Log.d(TAG, "next_filename_without_ext: " + next_filename_without_ext);
if( !filename_without_ext.equals(next_filename_without_ext) ) {
@@ -921,21 +1089,16 @@ public class StorageUtils {
break;
}
// so we've found another file with matching filename - is it a JPEG/etc?
- if( next_filename.toLowerCase(Locale.US).endsWith(".jpg") ) {
- if( MyDebug.LOG )
- Log.d(TAG, "found equivalent jpeg");
- found_non_raw = true;
- break;
- }
- else if( next_filename.toLowerCase(Locale.US).endsWith(".webp") ) {
+ // we've already restricted the query to the image types we're interested in, so
+ // only need to check that it isn't another DNG (which would be strange, as it
+ // would mean a duplicate filename, but check just in case!)
+ if( filenameIsRaw(next_filename) ) {
if( MyDebug.LOG )
- Log.d(TAG, "found equivalent webp");
- found_non_raw = true;
- break;
+ Log.d(TAG, "found another dng!");
}
- else if( next_filename.toLowerCase(Locale.US).endsWith(".png") ) {
+ else {
if( MyDebug.LOG )
- Log.d(TAG, "found equivalent png");
+ Log.d(TAG, "found equivalent non-dng");
found_non_raw = true;
break;
}
@@ -950,12 +1113,41 @@ public class StorageUtils {
long id = cursor.getLong(column_id_c);
long date = cursor.getLong(column_date_taken_c);
- int orientation = video ? 0 : cursor.getInt(column_orientation_c);
+ int orientation = (uri_type == UriType.MEDIASTORE_IMAGES) ? cursor.getInt(column_orientation_c) : 0;
Uri uri = ContentUris.withAppendedId(baseUri, id);
String filename = cursor.getString(column_name_c);
if( MyDebug.LOG )
- Log.d(TAG, "found most recent uri for " + (video ? "video" : "images") + ": " + uri);
- media = new Media(id, video, uri, date, orientation, filename);
+ Log.d(TAG, "found most recent uri for " + uri_type + ": " + uri);
+
+ boolean video;
+ switch( uri_type ) {
+ case MEDIASTORE_IMAGES:
+ video = false;
+ break;
+ case MEDIASTORE_VIDEOS:
+ video = true;
+ break;
+ default:
+ throw new RuntimeException("unknown uri_type: " + uri_type);
+ }
+ if( MyDebug.LOG )
+ Log.d(TAG, "video: " + video);
+
+ media = new Media(true, id, video, uri, date, orientation, filename);
+
+ if( MyDebug.LOG ) {
+ // debug
+ if( cursor.moveToFirst() ) {
+ do {
+ long this_id = cursor.getLong(column_id_c);
+ long this_date = cursor.getLong(column_date_taken_c);
+ Uri this_uri = ContentUris.withAppendedId(baseUri, this_id);
+ String this_filename = cursor.getString(column_name_c);
+ Log.d(TAG, "Date: " + this_date + " ID: " + this_id + " Name: " + this_filename + " Uri: " + this_uri);
+ }
+ while( cursor.moveToNext() );
+ }
+ }
}
else {
if( MyDebug.LOG )
@@ -974,46 +1166,255 @@ public class StorageUtils {
}
}
+ if( MyDebug.LOG )
+ Log.d(TAG, "return latest media: " + media);
return media;
}
- private Media getLatestMedia(boolean video) {
+ /** Used when using Storage Access Framework AND scoped storage.
+ * This is because with scoped storage, we don't request READ_EXTERNAL_STORAGE (as
+ * recommended). It's meant to be the case that applications should still be able to see files
+ * that they own - but whilst this is true when images are saved using mediastore API, this is
+ * NOT true when saving with Storage Access Framework - they don't show up in mediastore
+ * queries (even though they've definitely been added to the mediastore). So instead we read
+ * using the SAF uri, and if we need the media uri (e.g., to pass to Gallery application), use
+ * Media.getMediaStoreUri(). What a mess!
+ */
+ @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
+ private Media getLatestMediaSAF(Uri treeUri) {
+        if( MyDebug.LOG )
+ Log.d(TAG, "getLatestMediaSAF: " + treeUri);
+
+ Media media = null;
+
+ Uri baseUri;
+ try {
+ String parentDocUri = DocumentsContract.getTreeDocumentId(treeUri);
+ baseUri = DocumentsContract.buildChildDocumentsUriUsingTree(treeUri, parentDocUri);
+ }
+ catch(Exception e) {
+ // DocumentsContract.getTreeDocumentId throws IllegalArgumentException if the uri is
+ // invalid. Unclear if this can happen in practice - this happens in test
+ // testSaveFolderHistorySAF() but only because we test a dummy invalid SAF uri. But
+ // seems no harm catching it in case this can happen (e.g., especially if restoring
+ // backed up preferences from a different device?) Better to just show nothing in the
+ // thumbnail, rather than crashing!
+            // N.B., we catch Exception as otherwise the compiler complains that
+            // IllegalArgumentException is never thrown - even though it is!?
+ Log.e(TAG, "Exception using treeUri: " + treeUri);
+ return media;
+ }
if( MyDebug.LOG )
- Log.d(TAG, "getLatestMedia: " + (video ? "video" : "images"));
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && ContextCompat.checkSelfPermission(context, Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED ) {
+ Log.d(TAG, "baseUri: " + baseUri);
+
+ final int column_id_c = 0;
+ final int column_date_c = 1;
+ final int column_name_c = 2; // filename (without path), including extension
+ final int column_mime_c = 3;
+ String [] projection = new String[] {DocumentsContract.Document.COLUMN_DOCUMENT_ID, DocumentsContract.Document.COLUMN_LAST_MODIFIED, DocumentsContract.Document.COLUMN_DISPLAY_NAME, DocumentsContract.Document.COLUMN_MIME_TYPE};
+
+        // Note, it appears that when querying DocumentsContract, basic query functionality such as selection and ordering is ignored(!).
+ // See: https://stackoverflow.com/questions/52770188/how-to-filter-the-results-of-a-query-with-buildchilddocumentsuriusingtree
+ // https://stackoverflow.com/questions/56263620/contentresolver-query-on-documentcontract-lists-all-files-disregarding-selection
+ // So, we have to do it ourselves.
+
+ Cursor cursor = null;
+ try {
+ cursor = context.getContentResolver().query(baseUri, projection, null, null, null);
+ if( cursor != null && cursor.moveToFirst() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "found: " + cursor.getCount());
+
+ Uri latest_uri = null;
+ long latest_date = 0;
+ String latest_filename = null;
+ boolean latest_is_video = false;
+
+                // as well as scanning for the most recent image, we also keep track of the most recent non-RAW image,
+                // in case we want to prefer that when the most recent image is a RAW with a matching non-RAW version
+ Uri nonraw_latest_uri = null;
+ long nonraw_latest_date = 0;
+ String nonraw_latest_filename = null;
+
+ do {
+ long this_date = cursor.getLong(column_date_c);
+
+ String doc_id = cursor.getString(column_id_c);
+ Uri this_uri = DocumentsContract.buildDocumentUriUsingTree(treeUri, doc_id);
+ String this_mime_type = cursor.getString(column_mime_c);
+
+ // if updating this code for allowed mime types, also update corresponding code in getLatestMediaCore()
+ boolean is_allowed;
+ boolean this_is_video;
+ switch( this_mime_type ) {
+ case "image/jpeg":
+ case "image/webp":
+ case "image/png":
+ case "image/x-adobe-dng":
+ is_allowed = true;
+ this_is_video = false;
+ break;
+ case "video/3gpp":
+ case "video/webm":
+ case "video/mp4":
+ // n.b., perhaps we should just allow video/*, but we should still disallow .SRT files!
+ is_allowed = true;
+ this_is_video = true;
+ break;
+ default:
+ // skip unwanted file format
+ is_allowed = false;
+ this_is_video = false;
+ break;
+ }
+ if( !is_allowed ) {
+ continue;
+ }
+
+ String this_filename = cursor.getString(column_name_c);
+ /*if( MyDebug.LOG ) {
+ Log.d(TAG, "Date: " + this_date + " doc_id: " + doc_id + " Name: " + this_filename + " Uri: " + this_uri);
+ }*/
+
+ if( latest_uri == null || this_date > latest_date ) {
+ latest_uri = this_uri;
+ latest_date = this_date;
+ latest_filename = this_filename;
+ latest_is_video = this_is_video;
+ }
+ if( !this_is_video && !filenameIsRaw(this_filename) ) {
+ if( nonraw_latest_uri == null || this_date > nonraw_latest_date ) {
+ nonraw_latest_uri = this_uri;
+ nonraw_latest_date = this_date;
+ nonraw_latest_filename = this_filename;
+ }
+ }
+ }
+ while( cursor.moveToNext() );
+
+ if( latest_uri == null ) {
+ if( MyDebug.LOG )
+ Log.e(TAG, "couldn't find latest uri");
+ }
+ else {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "latest_uri: " + latest_uri);
+ Log.d(TAG, "nonraw_latest_uri: " + nonraw_latest_uri);
+ }
+
+ if( !latest_is_video && filenameIsRaw(latest_filename) && nonraw_latest_uri != null ) {
+ // prefer non-RAW to RAW? check filenames without extensions match
+ String filename_without_ext = filenameWithoutExtension(latest_filename);
+ String next_filename_without_ext = filenameWithoutExtension(nonraw_latest_filename);
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "filename_without_ext: " + filename_without_ext);
+ Log.d(TAG, "next_filename_without_ext: " + next_filename_without_ext);
+ }
+ if( filename_without_ext.equals(next_filename_without_ext) ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "prefer non-RAW to RAW");
+ latest_uri = nonraw_latest_uri;
+ latest_date = nonraw_latest_date;
+ latest_filename = nonraw_latest_filename;
+ // video is unchanged
+ }
+ }
+
+                    media = new Media(false, 0, latest_is_video, latest_uri, latest_date, 0, latest_filename);
+ }
+
+ /*if( MyDebug.LOG ) {
+ // debug
+ if( cursor.moveToFirst() ) {
+ do {
+ long this_id = cursor.getLong(column_id_c);
+ long this_date = cursor.getLong(column_date_taken_c);
+ Uri this_uri = ContentUris.withAppendedId(baseUri, this_id);
+ String this_filename = cursor.getString(column_name_c);
+ Log.d(TAG, "Date: " + this_date + " ID: " + this_id + " Name: " + this_filename + " Uri: " + this_uri);
+ }
+ while( cursor.moveToNext() );
+ }
+ }*/
+ }
+ else {
+ if( MyDebug.LOG )
+ Log.d(TAG, "mediastore returned no media");
+ }
+ }
+ catch(Exception e) {
+ if( MyDebug.LOG )
+ Log.e(TAG, "Exception trying to find latest media");
+ e.printStackTrace();
+ }
+ finally {
+ if( cursor != null ) {
+ cursor.close();
+ }
+ }
+
+ if( MyDebug.LOG )
+ Log.d(TAG, "return latest media: " + media);
+ return media;
+ }
+
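The RAW-pairing logic in getLatestMediaSAF() reduces to the two filename helpers: detect a `.dng` and compare filename stems with the extension stripped. A minimal standalone sketch (`RawPairSketch` and the sample filenames are hypothetical):

```java
import java.util.Locale;

// Standalone sketch of the filename helpers used above when pairing a DNG
// with its non-RAW (JPEG/WebP/PNG) counterpart; RawPairSketch is a
// hypothetical class name.
public class RawPairSketch {
    static boolean filenameIsRaw(String filename) {
        return filename.toLowerCase(Locale.US).endsWith(".dng");
    }

    static String filenameWithoutExtension(String filename) {
        String s = filename.toLowerCase(Locale.US);
        if( s.indexOf(".") > 0 )
            s = s.substring(0, s.lastIndexOf("."));
        return s;
    }

    public static void main(String[] args) {
        String raw = "IMG_20210101_120000.dng";
        String jpeg = "IMG_20210101_120000.jpg";
        // matching stems mean the JPEG is the non-RAW version of the DNG
        boolean pair = filenameIsRaw(raw)
                && filenameWithoutExtension(raw).equals(filenameWithoutExtension(jpeg));
        System.out.println(pair); // prints "true"
    }
}
```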
+ private Media getLatestMedia(UriType uri_type) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "getLatestMedia: " + uri_type);
+ if( !MainActivity.useScopedStorage() && Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && ContextCompat.checkSelfPermission(context, Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED ) {
// needed for Android 6, in case users deny storage permission, otherwise we get java.lang.SecurityException from ContentResolver.query()
// see https://developer.android.com/training/permissions/requesting.html
// we now request storage permission before opening the camera, but keep this here just in case
// we restrict check to Android 6 or later just in case, see note in LocationSupplier.setupLocationListener()
+ // update for scoped storage: here we should no longer need READ_EXTERNAL_STORAGE (which we won't have), instead we'll only be able to see
+ // media created by Open Camera, which is fine
if( MyDebug.LOG )
Log.e(TAG, "don't have READ_EXTERNAL_STORAGE permission");
return null;
}
- File save_folder = getImageFolder(); // may be null if using SAF
+ String save_folder = getImageFolderPath(); // may be null if using SAF
if( MyDebug.LOG )
Log.d(TAG, "save_folder: " + save_folder);
String bucket_id = null;
if( save_folder != null ) {
- bucket_id = String.valueOf(save_folder.getAbsolutePath().toLowerCase().hashCode());
+ bucket_id = String.valueOf(save_folder.toLowerCase().hashCode());
}
if( MyDebug.LOG )
Log.d(TAG, "bucket_id: " + bucket_id);
- Uri baseUri = video ? Video.Media.EXTERNAL_CONTENT_URI : MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
- Media media = getLatestMediaCore(baseUri, bucket_id, video);
+ Uri baseUri;
+ switch( uri_type ) {
+ case MEDIASTORE_IMAGES:
+ baseUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
+ break;
+ case MEDIASTORE_VIDEOS:
+ baseUri = Video.Media.EXTERNAL_CONTENT_URI;
+ break;
+ default:
+ throw new RuntimeException("unknown uri_type: " + uri_type);
+ }
+
+ if( MyDebug.LOG )
+ Log.d(TAG, "baseUri: " + baseUri);
+ Media media = getLatestMediaCore(baseUri, bucket_id, uri_type);
if( media == null && bucket_id != null ) {
if( MyDebug.LOG )
Log.d(TAG, "fall back to checking any folder");
- media = getLatestMediaCore(baseUri, null, video);
+ media = getLatestMediaCore(baseUri, null, uri_type);
}
return media;
}
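The bucket_id filter above relies on MediaStore's convention that BUCKET_ID is derived from the hash code of the lowercased folder path. A minimal sketch of that derivation (class name and example path are hypothetical):

```java
// Sketch of deriving a MediaStore BUCKET_ID filter value from a save folder
// path, mirroring the bucket_id computation in getLatestMedia() above.
// BucketIdSketch and the example path are hypothetical.
public class BucketIdSketch {
    static String bucketId(String absolutePath) {
        // mirrors the original: default-locale toLowerCase(), then String.hashCode()
        return String.valueOf(absolutePath.toLowerCase().hashCode());
    }

    public static void main(String[] args) {
        // the same folder in different case yields the same bucket id
        System.out.println(bucketId("/storage/emulated/0/DCIM/OpenCamera")
                .equals(bucketId("/storage/emulated/0/dcim/opencamera"))); // prints "true"
    }
}
```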
Media getLatestMedia() {
- Media image_media = getLatestMedia(false);
- Media video_media = getLatestMedia(true);
+ if( MainActivity.useScopedStorage() && this.isUsingSAF() ) {
+ Uri treeUri = this.getTreeUriSAF();
+ return getLatestMediaSAF(treeUri);
+ }
+
+ Media image_media = getLatestMedia(UriType.MEDIASTORE_IMAGES);
+ Media video_media = getLatestMedia(UriType.MEDIASTORE_VIDEOS);
Media media = null;
if( image_media != null && video_media == null ) {
if( MyDebug.LOG )
@@ -1051,13 +1452,14 @@ public class StorageUtils {
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
private long freeMemorySAF() {
Uri treeUri = applicationInterface.getStorageUtils().getTreeUriSAF();
+ ParcelFileDescriptor pfd = null;
if( MyDebug.LOG )
Log.d(TAG, "treeUri: " + treeUri);
try {
Uri docUri = DocumentsContract.buildDocumentUriUsingTree(treeUri, DocumentsContract.getTreeDocumentId(treeUri));
if( MyDebug.LOG )
Log.d(TAG, "docUri: " + docUri);
- ParcelFileDescriptor pfd = context.getContentResolver().openFileDescriptor(docUri, "r");
+ pfd = context.getContentResolver().openFileDescriptor(docUri, "r");
if( pfd == null ) { // just in case
Log.e(TAG, "pfd is null!");
throw new FileNotFoundException();
@@ -1083,6 +1485,15 @@ public class StorageUtils {
// now.
e.printStackTrace();
}
+ finally {
+ try {
+ if( pfd != null )
+ pfd.close();
+ }
+ catch(IOException e) {
+ e.printStackTrace();
+ }
+ }
return -1;
}
@@ -1095,6 +1506,7 @@ public class StorageUtils {
// if we fail for SAF, don't fall back to the methods below, as this may be incorrect (especially for external SD card)
return freeMemorySAF();
}
+ // n.b., StatFs still seems to work with Android 10's scoped storage... (and there doesn't seem to be an official non-File based equivalent)
try {
File folder = getImageFolder();
if( folder == null ) {
@@ -1122,7 +1534,7 @@ public class StorageUtils {
if( !isUsingSAF() ) {
// getSaveLocation() only valid if !isUsingSAF()
String folder_name = getSaveLocation();
- if( !folder_name.startsWith("/") ) {
+ if( !saveFolderIsFull(folder_name) ) {
File folder = getBaseFolder();
StatFs statFs = new StatFs(folder.getAbsolutePath());
long blocks, size;
diff --git a/app/src/main/java/net/sourceforge/opencamera/TakePhoto.java b/app/src/main/java/net/sourceforge/opencamera/TakePhoto.java
index a80f5a1ad17a37212ac87cd8d360bc6833ce2628..982977ed5244765a408e7a7e314a30caf297ff43 100644
--- a/app/src/main/java/net/sourceforge/opencamera/TakePhoto.java
+++ b/app/src/main/java/net/sourceforge/opencamera/TakePhoto.java
@@ -1,15 +1,16 @@
package net.sourceforge.opencamera;
-import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.util.Log;
+import androidx.appcompat.app.AppCompatActivity;
+
/** Entry Activity for the "take photo" widget (see MyWidgetProviderTakePhoto).
* This redirects to MainActivity, but uses an intent extra/bundle to pass the
* "take photo" request.
*/
-public class TakePhoto extends Activity {
+public class TakePhoto extends AppCompatActivity {
private static final String TAG = "TakePhoto";
// Usually passing data via intent is preferred to using statics - however here a static is better for security,
diff --git a/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraController.java b/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraController.java
index 49835b9b83db241f0a033ac754e3687eb588b92d..c8f6e12c746b2080985e9a3d2cea7092b23af364 100644
--- a/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraController.java
+++ b/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraController.java
@@ -15,6 +15,8 @@ import android.util.Log;
import android.view.SurfaceHolder;
import android.view.TextureView;
+import androidx.annotation.NonNull;
+
/** CameraController is an abstract class that wraps up the access/control to
* the Android camera, so that the rest of the application doesn't have to
* deal directly with the Android camera API. It also allows us to support
@@ -38,12 +40,11 @@ public abstract class CameraController {
public static final String ISO_DEFAULT = "auto";
public static final long EXPOSURE_TIME_DEFAULT = 1000000000L/30; // note, responsibility of callers to check that this is within the valid min/max range
- public static final int ISO_FOR_DARK = 1100;
public static final int N_IMAGES_NR_DARK = 8;
public static final int N_IMAGES_NR_DARK_LOW_LIGHT = 15;
// for testing:
- volatile int count_camera_parameters_exception;
+ public volatile int count_camera_parameters_exception;
public volatile int count_precapture_timeout;
public volatile boolean test_wait_capture_result; // whether to test delayed capture result in Camera2 API
public volatile boolean test_release_during_photo; // for Camera2 API, will force takePictureAfterPrecapture() to call release() on UI thread
@@ -65,6 +66,7 @@ public abstract class CameraController {
public List<CameraController.Size> video_sizes;
public List<CameraController.Size> video_sizes_high_speed; // may be null if high speed not supported
public List<CameraController.Size> preview_sizes;
+    public List<Integer> supported_extensions; // if non-null, list of supported camera vendor extensions, see https://developer.android.com/reference/android/hardware/camera2/CameraExtensionCharacteristics
public List<String> supported_flash_values;
public List<String> supported_focus_values;
public float [] apertures; // may be null if not supported, else will have at least 2 values
@@ -136,7 +138,7 @@ public abstract class CameraController {
}
// Android docs and FindBugs recommend that Comparators also be Serializable
-    public static class RangeSorter implements Comparator<int[]>, Serializable {
+    static class RangeSorter implements Comparator<int[]>, Serializable {
private static final long serialVersionUID = 5802214721073728212L;
@Override
public int compare(int[] o1, int[] o2) {
@@ -148,7 +150,7 @@ public abstract class CameraController {
/* Sorts resolutions from highest to lowest, by area.
* Android docs and FindBugs recommend that Comparators also be Serializable
*/
-    public static class SizeSorter implements Comparator<Size>, Serializable {
+    static class SizeSorter implements Comparator<Size>, Serializable {
private static final long serialVersionUID = 5802214721073718212L;
@Override
@@ -161,6 +163,7 @@ public abstract class CameraController {
public final int width;
public final int height;
public boolean supports_burst; // for photo
+        public List<Integer> supported_extensions; // for photo and preview: if non-null, list of supported camera vendor extensions
final List<int[]> fps_ranges; // for video
public final boolean high_speed; // for video
@@ -174,7 +177,17 @@ public abstract class CameraController {
}
public Size(int width, int height) {
-            this(width, height, new ArrayList<int[]>(), false);
+            this(width, height, new ArrayList<>(), false);
+ }
+
+ /** Whether this size supports the requested burst and/or extension
+ */
+ public boolean supportsRequirements(boolean want_burst, boolean want_extension, int extension) {
+ return (!want_burst || this.supports_burst) && (!want_extension || this.supportsExtension(extension));
+ }
+
+ public boolean supportsExtension(int extension) {
+ return supported_extensions != null && supported_extensions.contains(extension);
}
boolean supportsFrameRate(double fps) {
@@ -203,6 +216,7 @@ public abstract class CameraController {
return width*41 + height;
}
+ @NonNull
public String toString() {
StringBuilder s = new StringBuilder();
for (int[] f : this.fps_ranges) {
@@ -276,10 +290,13 @@ public abstract class CameraController {
public static class Face {
public final int score;
- /* The has values from [-1000,-1000] (for top-left) to [1000,1000] (for bottom-right) for whatever is
+ /* The rect has values from [-1000,-1000] (for top-left) to [1000,1000] (for bottom-right) for whatever is
* the current field of view (i.e., taking zoom into account).
*/
public final Rect rect;
+ /** The temp rect is temporary storage that can be used by callers.
+ */
+ public final Rect temp = new Rect();
Face(int score, Rect rect) {
this.score = score;
@@ -369,6 +386,9 @@ public abstract class CameraController {
public abstract CameraController.Size getPreviewSize();
public abstract void setPreviewSize(int width, int height);
+ public abstract void setCameraExtension(boolean enabled, int extension);
+ public abstract boolean isCameraExtension();
+ public abstract int getCameraExtension();
// whether to take a burst of images, and if so, what type
public enum BurstType {
BURSTTYPE_NONE, // no burst
@@ -400,6 +420,11 @@ public abstract class CameraController {
*/
public abstract void setExpoBracketingStops(double stops);
public abstract void setUseExpoFastBurst(boolean use_expo_fast_burst);
+ /** Whether to enable a workaround hack for some Galaxy devices - take an additional dummy photo
+ * when taking an expo/HDR burst, to avoid problem where manual exposure is ignored for the
+ * first image.
+ */
+ public abstract void setDummyCaptureHack(boolean dummy_capture_hack);
public abstract boolean isBurstOrExpo();
/** If true, then the camera controller is currently capturing a burst of images.
*/
@@ -478,6 +503,7 @@ public abstract class CameraController {
public abstract void setJpegQuality(int quality);
public abstract int getZoom();
public abstract void setZoom(int value);
+ public abstract void resetZoom(); // resets to zoom 1x
public abstract int getExposureCompensation();
public abstract boolean setExposureCompensation(int new_exposure);
public abstract void setPreviewFpsRange(int min, int max);
@@ -519,6 +545,7 @@ public abstract class CameraController {
public abstract List<Area> getFocusAreas();
public abstract List<Area> getMeteringAreas();
public abstract boolean supportsAutoFocus();
+ public abstract boolean supportsMetering();
public abstract boolean focusIsContinuous();
public abstract boolean focusIsVideo();
public abstract void reconnect() throws CameraControllerException;
diff --git a/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraController1.java b/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraController1.java
index 1310a17ef31369d9d45dc8dc344e058b5ecc2f95..379da2c2b1e741d510b25a56130d606069af5bbe 100644
--- a/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraController1.java
+++ b/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraController1.java
@@ -435,6 +435,7 @@ public class CameraController1 extends CameraController {
catch(RuntimeException e) {
Log.e(TAG, "exception from getParameters");
e.printStackTrace();
+ count_camera_parameters_exception++;
return null;
}
List<String> values = parameters.getSupportedSceneModes();
@@ -759,6 +760,21 @@ public class CameraController1 extends CameraController {
setCameraParameters(parameters);
}
+ @Override
+ public void setCameraExtension(boolean enabled, int extension) {
+ // not supported
+ }
+
+ @Override
+ public boolean isCameraExtension() {
+ return false;
+ }
+
+ @Override
+ public int getCameraExtension() {
+ return -1;
+ }
+
@Override
public void setBurstType(BurstType burst_type) {
if( MyDebug.LOG )
@@ -835,6 +851,11 @@ public class CameraController1 extends CameraController {
this.expo_bracketing_stops = stops;
}
+ @Override
+ public void setDummyCaptureHack(boolean dummy_capture_hack) {
+ // not supported for CameraController1
+ }
+
@Override
public void setUseExpoFastBurst(boolean use_expo_fast_burst) {
// not supported for CameraController1
@@ -890,8 +911,17 @@ public class CameraController1 extends CameraController {
}
public boolean getVideoStabilization() {
- Camera.Parameters parameters = this.getParameters();
- return parameters.getVideoStabilization();
+ try {
+ Camera.Parameters parameters = this.getParameters();
+ return parameters.getVideoStabilization();
+ }
+ catch(RuntimeException e) {
+ // have had crashes from Google Play for getParameters - assume video stabilization not enabled
+ Log.e(TAG, "failed to get parameters for video stabilization");
+ e.printStackTrace();
+ count_camera_parameters_exception++;
+ return false;
+ }
}
@Override
@@ -934,9 +964,15 @@ public class CameraController1 extends CameraController {
catch(RuntimeException e) {
Log.e(TAG, "failed to set parameters for zoom");
e.printStackTrace();
+ count_camera_parameters_exception++;
}
}
+ @Override
+ public void resetZoom() {
+ setZoom(0);
+ }
+
public int getExposureCompensation() {
/*Camera.Parameters parameters = this.getParameters();
return parameters.getExposureCompensation();*/
@@ -990,6 +1026,7 @@ public class CameraController1 extends CameraController {
// but here it doesn't really matter if we fail to set the fps range
Log.e(TAG, "setPreviewFpsRange failed to get parameters");
e.printStackTrace();
+ count_camera_parameters_exception++;
}
}
@@ -1012,6 +1049,7 @@ public class CameraController1 extends CameraController {
But that's a subclass of RuntimeException which we now catch anyway.
*/
e.printStackTrace();
+ count_camera_parameters_exception++;
}
return null;
}
@@ -1266,6 +1304,7 @@ public class CameraController1 extends CameraController {
// but here it doesn't really matter if we fail to set the recording hint
Log.e(TAG, "setRecordingHint failed to get parameters");
e.printStackTrace();
+ count_camera_parameters_exception++;
}
}
@@ -1372,9 +1411,14 @@ public class CameraController1 extends CameraController {
setCameraParameters(parameters);
}
+ else {
+ if( MyDebug.LOG )
+ Log.d(TAG, "metering areas not supported");
+ }
}
catch(RuntimeException e) {
e.printStackTrace();
+ count_camera_parameters_exception++;
}
return false;
}
@@ -1397,6 +1441,7 @@ public class CameraController1 extends CameraController {
}
catch(RuntimeException e) {
e.printStackTrace();
+ count_camera_parameters_exception++;
}
}
@@ -1437,6 +1482,20 @@ public class CameraController1 extends CameraController {
}
catch(RuntimeException e) {
e.printStackTrace();
+ count_camera_parameters_exception++;
+ }
+ return false;
+ }
+
+ @Override
+ public boolean supportsMetering() {
+ try {
+ Camera.Parameters parameters = this.getParameters();
+ return parameters.getMaxNumMeteringAreas() > 0;
+ }
+ catch(RuntimeException e) {
+ e.printStackTrace();
+ count_camera_parameters_exception++;
}
return false;
}
@@ -1454,6 +1513,7 @@ public class CameraController1 extends CameraController {
}
catch(RuntimeException e) {
e.printStackTrace();
+ count_camera_parameters_exception++;
}
return false;
}
@@ -1544,23 +1604,29 @@ public class CameraController1 extends CameraController {
catch(RuntimeException e) {
if( MyDebug.LOG )
Log.d(TAG, "face detection failed or already started");
+ count_camera_parameters_exception++;
return false;
}
return true;
}
public void setFaceDetectionListener(final CameraController.FaceDetectionListener listener) {
- class CameraFaceDetectionListener implements Camera.FaceDetectionListener {
- @Override
- public void onFaceDetection(Camera.Face[] camera_faces, Camera camera) {
- Face [] faces = new Face[camera_faces.length];
-				for(int i=0;i<camera_faces.length;i++) {
[...]
	private List<Integer> zoom_ratios;
private int current_zoom_value;
+ private int zoom_value_1x; // index into zoom_ratios list that is for zoom 1x
private boolean supports_face_detect_mode_simple;
private boolean supports_face_detect_mode_full;
private boolean supports_optical_stabilization;
private boolean supports_photo_video_recording;
private boolean supports_white_balance_temperature;
+ private String initial_focus_mode; // if non-null, focus mode to use if not set by Preview (rather than relying on the Builder template's default, which can be one that isn't supported, at least on Android emulator with its LIMITED camera!)
+ private boolean supports_exposure_time;
+ private long min_exposure_time;
+ private long max_exposure_time;
private final static int tonemap_log_max_curve_points_c = 64;
private final static float [] jtvideo_values_base = new float[] {
@@ -149,7 +165,17 @@ public class CameraController2 extends CameraController {
private final ErrorCallback preview_error_cb;
private final ErrorCallback camera_error_cb;
- private CameraCaptureSession captureSession;
+
+ private enum SessionType {
+ SESSIONTYPE_NORMAL, // standard use of Camera2 API, via CameraCaptureSession
+ SESSIONTYPE_EXTENSION, // use of vendor extension, via CameraExtensionSession
+ }
+ private SessionType sessionType = SessionType.SESSIONTYPE_NORMAL;
+ //private SessionType sessionType = SessionType.SESSIONTYPE_EXTENSION; // test
+ private CameraCaptureSession captureSession; // used if sessionType == SESSIONTYPE_NORMAL
+ private CameraExtensionSession extensionSession; // used if sessionType == SESSIONTYPE_EXTENSION
+ private int camera_extension = 0; // used if sessionType == SESSIONTYPE_EXTENSION
+
private CaptureRequest.Builder previewBuilder;
private boolean previewIsVideoMode;
private AutoFocusCallback autofocus_cb;
@@ -181,12 +207,19 @@ public class CameraController2 extends CameraController {
private boolean continuous_burst_in_progress; // whether we're currently taking a continuous burst
private boolean continuous_burst_requested_last_capture; // whether we've requested the last capture
+ // Whether to enable a workaround hack for some Galaxy devices - take an additional dummy photo
+ // when taking an expo/HDR burst, to avoid problem where manual exposure is ignored for the
+ // first image.
+ private boolean dummy_capture_hack = false;
+ //private boolean dummy_capture_hack = true; // test
+
private boolean optimise_ae_for_dro = false;
private boolean want_raw;
//private boolean want_raw = true;
private int max_raw_images;
private android.util.Size raw_size;
private ImageReader imageReaderRaw;
+ private OnImageAvailableListener onImageAvailableListener;
private OnRawImageAvailableListener onRawImageAvailableListener;
private PictureCallback picture_cb;
private boolean jpeg_todo; // whether we are still waiting for JPEG images
@@ -213,6 +246,7 @@ public class CameraController2 extends CameraController {
private Surface surface_texture;
private HandlerThread thread;
private Handler handler;
+ private Executor executor;
private Surface video_recorder_surface;
private int preview_width;
@@ -265,8 +299,26 @@ public class CameraController2 extends CameraController {
/*private boolean capture_result_has_focus_distance;
private float capture_result_focus_distance_min;
private float capture_result_focus_distance_max;*/
- private final static long max_preview_exposure_time_c = 1000000000L/12;
-
+ /** Even if using long exposure, we want to set a maximum for the preview to avoid very low
+ * frame rates.
+ * Originally this was 1/12s, but I think we can get away with 1/5s - for this range, having
+ * a WYSIWYG preview is probably still better than the reduced framerate. Also as a side-benefit,
+ * it reduces the impact of the Samsung Galaxy Android 11 bug where manual exposure is ignored if
+ * different to the preview.
+ */
+ private final static long max_preview_exposure_time_c = 1000000000L/5;
+
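The comment above trades preview frame rate against a WYSIWYG long exposure; the arithmetic is just the reciprocal of the exposure-time cap. A standalone sketch of that relationship (plain Java, class and method names are illustrative, not part of the patch):

```java
// Illustrative only: how the preview exposure-time cap bounds the preview
// frame rate. If every frame exposes for exposureTimeNs nanoseconds, the
// preview cannot exceed 1e9/exposureTimeNs frames per second.
public class PreviewExposureCap {
    // maximum achievable preview fps for a given per-frame exposure time
    static double maxFps(long exposureTimeNs) {
        return 1.0e9 / exposureTimeNs;
    }

    public static void main(String [] args) {
        System.out.println(maxFps(1000000000L/12)); // old 1/12s cap: ~12fps
        System.out.println(maxFps(1000000000L/5));  // new 1/5s cap: 5fps
    }
}
```

So the change above accepts preview frame rates as low as 5fps (was ~12fps) in exchange for the preview matching the captured exposure over a wider range.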
+ private void resetCaptureResultInfo() {
+ capture_result_is_ae_scanning = false;
+ capture_result_ae = null;
+ is_flash_required = false;
+ capture_result_has_white_balance_rggb = false;
+ capture_result_has_iso = false;
+ capture_result_has_exposure_time = false;
+ capture_result_has_frame_duration = false;
+ capture_result_has_aperture = false;
+ }
+
private enum RequestTagType {
CAPTURE, // request is either for a regular non-burst capture, or the last of a burst capture sequence
CAPTURE_BURST_IN_PROGRESS // request is for a burst capture, but isn't the last of the burst capture sequence
@@ -310,9 +362,11 @@ public class CameraController2 extends CameraController {
private int antibanding = CameraMetadata.CONTROL_AE_ANTIBANDING_MODE_AUTO;
private boolean has_edge_mode;
private int edge_mode = CameraMetadata.EDGE_MODE_FAST;
+ private boolean has_default_edge_mode;
private Integer default_edge_mode;
private boolean has_noise_reduction_mode;
private int noise_reduction_mode = CameraMetadata.NOISE_REDUCTION_MODE_FAST;
+ private boolean has_default_noise_reduction_mode;
private Integer default_noise_reduction_mode;
private int white_balance_temperature = 5000; // used for white_balance == CONTROL_AWB_MODE_OFF
private String flash_value = "flash_off";
@@ -323,7 +377,9 @@ public class CameraController2 extends CameraController {
private long exposure_time = EXPOSURE_TIME_DEFAULT;
private boolean has_aperture;
private float aperture;
- private Rect scalar_crop_region; // no need for has_scalar_crop_region, as we can set to null instead
+ private boolean has_control_zoom_ratio; // zoom for Android 11+
+ private float control_zoom_ratio; // zoom for Android 11+
+ private Rect scalar_crop_region; // zoom for older Android versions; no need for has_scalar_crop_region, as we can set to null instead
private boolean has_ae_exposure_compensation;
private int ae_exposure_compensation;
private boolean has_af_mode;
@@ -332,8 +388,8 @@ public class CameraController2 extends CameraController {
private float focus_distance_manual; // saved setting when in manual mode (so if user switches to infinity mode and back, we'll still remember the manual focus distance)
private boolean ae_lock;
private boolean wb_lock;
- private MeteringRectangle [] af_regions; // no need for has_scalar_crop_region, as we can set to null instead
- private MeteringRectangle [] ae_regions; // no need for has_scalar_crop_region, as we can set to null instead
+ private MeteringRectangle [] af_regions; // no need for has_af_regions, as we can set to null instead
+ private MeteringRectangle [] ae_regions; // no need for has_ae_regions, as we can set to null instead
private boolean has_face_detect_mode;
private int face_detect_mode = CaptureRequest.STATISTICS_FACE_DETECT_MODE_OFF;
private Integer default_optical_stabilization;
@@ -384,13 +440,16 @@ public class CameraController2 extends CameraController {
//builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
//builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
- builder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
+ if( sessionType != SessionType.SESSIONTYPE_EXTENSION ) {
+ builder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
+ }
setSceneMode(builder);
setColorEffect(builder);
setWhiteBalance(builder);
setAntiBanding(builder);
setAEMode(builder, is_still);
+ setControlZoomRatio(builder);
setCropRegion(builder);
setExposureCompensation(builder);
setFocusMode(builder);
@@ -405,7 +464,10 @@ public class CameraController2 extends CameraController {
setTonemapProfile(builder);
if( is_still ) {
- if( location != null ) {
+ if( location != null && sessionType != SessionType.SESSIONTYPE_EXTENSION ) {
+ // JPEG_GPS_LOCATION not supported for camera extensions, so instead this must
+ // be set by the caller when receiving the image data (see ImageSaver.modifyExif(),
+ // where we do this using ExifInterface.setGpsInfo()).
builder.set(CaptureRequest.JPEG_GPS_LOCATION, location);
}
builder.set(CaptureRequest.JPEG_ORIENTATION, rotation);
@@ -423,11 +485,11 @@ public class CameraController2 extends CameraController {
/*builder.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_OFF);
builder.set(CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE, CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE_OFF);
builder.set(CaptureRequest.EDGE_MODE, CaptureRequest.EDGE_MODE_OFF);
- if( Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.M ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
builder.set(CaptureRequest.TONEMAP_MODE, CaptureRequest.TONEMAP_MODE_GAMMA_VALUE);
builder.set(CaptureRequest.TONEMAP_GAMMA, 5.0f);
}*/
- /*if( Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.N ) {
+ /*if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N ) {
builder.set(CaptureRequest.CONTROL_POST_RAW_SENSITIVITY_BOOST, 0);
}*/
/*builder.set(CaptureRequest.CONTROL_EFFECT_MODE, CaptureRequest.CONTROL_EFFECT_MODE_OFF);
@@ -452,7 +514,7 @@ public class CameraController2 extends CameraController {
}
}
}*/
- /*if( Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.M ) {
+ /*if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
builder.set(CaptureRequest.TONEMAP_MODE, CaptureRequest.TONEMAP_MODE_PRESET_CURVE);
builder.set(CaptureRequest.TONEMAP_PRESET_CURVE, CaptureRequest.TONEMAP_PRESET_CURVE_SRGB);
}*/
@@ -465,7 +527,7 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "edge_mode: " + (edge_mode==null ? "null" : edge_mode));
Integer cc_mode = builder.get(CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE);
Log.d(TAG, "cc_mode: " + (cc_mode==null ? "null" : cc_mode));
- /*if( Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.N ) {
+ /*if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N ) {
Integer raw_sensitivity_boost = builder.get(CaptureRequest.CONTROL_POST_RAW_SENSITIVITY_BOOST);
Log.d(TAG, "raw_sensitivity_boost: " + (raw_sensitivity_boost==null ? "null" : raw_sensitivity_boost));
}*/
@@ -480,6 +542,12 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "setSceneMode");
Log.d(TAG, "builder: " + builder);
}
+
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ return false;
+ }
+
Integer current_scene_mode = builder.get(CaptureRequest.CONTROL_SCENE_MODE);
if( has_face_detect_mode ) {
// face detection mode overrides scene mode
@@ -507,10 +575,13 @@ public class CameraController2 extends CameraController {
}
private boolean setColorEffect(CaptureRequest.Builder builder) {
- /*if( builder.get(CaptureRequest.CONTROL_EFFECT_MODE) == null && color_effect == CameraMetadata.CONTROL_EFFECT_MODE_OFF ) {
- // can leave off
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
}
- else*/ if( builder.get(CaptureRequest.CONTROL_EFFECT_MODE) == null || builder.get(CaptureRequest.CONTROL_EFFECT_MODE) != color_effect ) {
+ /*else if( builder.get(CaptureRequest.CONTROL_EFFECT_MODE) == null && color_effect == CameraMetadata.CONTROL_EFFECT_MODE_OFF ) {
+ // can leave off
+ }*/
+ else if( builder.get(CaptureRequest.CONTROL_EFFECT_MODE) == null || builder.get(CaptureRequest.CONTROL_EFFECT_MODE) != color_effect ) {
if( MyDebug.LOG )
Log.d(TAG, "setting color effect: " + color_effect);
builder.set(CaptureRequest.CONTROL_EFFECT_MODE, color_effect);
@@ -521,10 +592,13 @@ public class CameraController2 extends CameraController {
private boolean setWhiteBalance(CaptureRequest.Builder builder) {
boolean changed = false;
- /*if( builder.get(CaptureRequest.CONTROL_AWB_MODE) == null && white_balance == CameraMetadata.CONTROL_AWB_MODE_AUTO ) {
- // can leave off
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
}
- else*/ if( builder.get(CaptureRequest.CONTROL_AWB_MODE) == null || builder.get(CaptureRequest.CONTROL_AWB_MODE) != white_balance ) {
+ /*else if( builder.get(CaptureRequest.CONTROL_AWB_MODE) == null && white_balance == CameraMetadata.CONTROL_AWB_MODE_AUTO ) {
+ // can leave off
+ }*/
+ else if( builder.get(CaptureRequest.CONTROL_AWB_MODE) == null || builder.get(CaptureRequest.CONTROL_AWB_MODE) != white_balance ) {
if( MyDebug.LOG )
Log.d(TAG, "setting white balance: " + white_balance);
builder.set(CaptureRequest.CONTROL_AWB_MODE, white_balance);
@@ -534,7 +608,7 @@ public class CameraController2 extends CameraController {
if( MyDebug.LOG )
Log.d(TAG, "setting white balance temperature: " + white_balance_temperature);
// manual white balance
- RggbChannelVector rggbChannelVector = convertTemperatureToRggb(white_balance_temperature);
+ RggbChannelVector rggbChannelVector = convertTemperatureToRggbVector(white_balance_temperature);
builder.set(CaptureRequest.COLOR_CORRECTION_MODE, CameraMetadata.COLOR_CORRECTION_MODE_TRANSFORM_MATRIX);
builder.set(CaptureRequest.COLOR_CORRECTION_GAINS, rggbChannelVector);
changed = true;
@@ -544,7 +618,10 @@ public class CameraController2 extends CameraController {
private boolean setAntiBanding(CaptureRequest.Builder builder) {
boolean changed = false;
- if( has_antibanding ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( has_antibanding ) {
if( builder.get(CaptureRequest.CONTROL_AE_ANTIBANDING_MODE) == null || builder.get(CaptureRequest.CONTROL_AE_ANTIBANDING_MODE) != antibanding ) {
if( MyDebug.LOG )
Log.d(TAG, "setting antibanding: " + antibanding);
@@ -558,12 +635,17 @@ public class CameraController2 extends CameraController {
private boolean setEdgeMode(CaptureRequest.Builder builder) {
if( MyDebug.LOG ) {
Log.d(TAG, "setEdgeMode");
+ Log.d(TAG, "has_default_edge_mode: " + has_default_edge_mode);
Log.d(TAG, "default_edge_mode: " + default_edge_mode);
}
boolean changed = false;
- if( has_edge_mode ) {
- if( default_edge_mode == null ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( has_edge_mode ) {
+ if( !has_default_edge_mode ) {
// save the default edge_mode
+ has_default_edge_mode = true;
default_edge_mode = builder.get(CaptureRequest.EDGE_MODE);
if( MyDebug.LOG )
Log.d(TAG, "default_edge_mode: " + default_edge_mode);
@@ -587,7 +669,7 @@ public class CameraController2 extends CameraController {
// need EDGE_MODE_OFF to avoid a "glow" effect
builder.set(CaptureRequest.EDGE_MODE, CaptureRequest.EDGE_MODE_OFF);
}
- else if( default_edge_mode != null ) {
+ else if( has_default_edge_mode ) {
if( builder.get(CaptureRequest.EDGE_MODE) != null && !builder.get(CaptureRequest.EDGE_MODE).equals(default_edge_mode) ) {
builder.set(CaptureRequest.EDGE_MODE, default_edge_mode);
changed = true;
@@ -599,12 +681,17 @@ public class CameraController2 extends CameraController {
private boolean setNoiseReductionMode(CaptureRequest.Builder builder) {
if( MyDebug.LOG ) {
Log.d(TAG, "setNoiseReductionMode");
+ Log.d(TAG, "has_default_noise_reduction_mode: " + has_default_noise_reduction_mode);
Log.d(TAG, "default_noise_reduction_mode: " + default_noise_reduction_mode);
}
boolean changed = false;
- if( has_noise_reduction_mode ) {
- if( default_noise_reduction_mode == null ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( has_noise_reduction_mode ) {
+ if( !has_default_noise_reduction_mode ) {
// save the default noise_reduction_mode
+ has_default_noise_reduction_mode = true;
default_noise_reduction_mode = builder.get(CaptureRequest.NOISE_REDUCTION_MODE);
if( MyDebug.LOG )
Log.d(TAG, "default_noise_reduction_mode: " + default_noise_reduction_mode);
@@ -628,7 +715,7 @@ public class CameraController2 extends CameraController {
// need NOISE_REDUCTION_MODE_OFF to avoid excessive blurring
builder.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_OFF);
}
- else if( default_noise_reduction_mode != null ) {
+ else if( has_default_noise_reduction_mode ) {
if( builder.get(CaptureRequest.NOISE_REDUCTION_MODE) != null && !builder.get(CaptureRequest.NOISE_REDUCTION_MODE).equals(default_noise_reduction_mode)) {
builder.set(CaptureRequest.NOISE_REDUCTION_MODE, default_noise_reduction_mode);
changed = true;
@@ -640,13 +727,16 @@ public class CameraController2 extends CameraController {
private boolean setAperture(CaptureRequest.Builder builder) {
if( MyDebug.LOG )
Log.d(TAG, "setAperture");
- // don't set at all if has_aperture==false
- if( has_aperture ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( has_aperture ) {
if( MyDebug.LOG )
Log.d(TAG, " aperture: " + aperture);
builder.set(CaptureRequest.LENS_APERTURE, aperture);
return true;
}
+ // don't set at all if has_aperture==false
return false;
}
@@ -654,6 +744,12 @@ public class CameraController2 extends CameraController {
private boolean setAEMode(CaptureRequest.Builder builder, boolean is_still) {
if( MyDebug.LOG )
Log.d(TAG, "setAEMode");
+
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ return false;
+ }
+
if( has_iso ) {
if( MyDebug.LOG ) {
Log.d(TAG, "manual mode");
@@ -741,8 +837,20 @@ public class CameraController2 extends CameraController {
return true;
}
+ private void setControlZoomRatio(CaptureRequest.Builder builder) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.R && has_control_zoom_ratio ) {
+ builder.set(CaptureRequest.CONTROL_ZOOM_RATIO, control_zoom_ratio);
+ }
+ }
+
private void setCropRegion(CaptureRequest.Builder builder) {
- if( scalar_crop_region != null ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( scalar_crop_region != null && Build.VERSION.SDK_INT < Build.VERSION_CODES.R ) {
builder.set(CaptureRequest.SCALER_CROP_REGION, scalar_crop_region);
}
}
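The split above reflects the two zoom paths: `CONTROL_ZOOM_RATIO` on Android 11+ versus a centred `SCALER_CROP_REGION` on older versions. A minimal sketch of the older mapping, assuming a crop centred on the sensor's active array (pure arithmetic; `android.graphics.Rect` replaced by an `int[4]` of left/top/right/bottom, names hypothetical):

```java
// Hypothetical helper showing how a zoom factor maps to a centred crop of the
// sensor's active array, as used for SCALER_CROP_REGION on pre-Android-11 zoom.
public class CropRegion {
    // returns {left, top, right, bottom} of the cropped region
    static int[] centredCrop(int sensorWidth, int sensorHeight, float zoom) {
        int cropW = (int)(sensorWidth / zoom);
        int cropH = (int)(sensorHeight / zoom);
        int left = (sensorWidth - cropW) / 2;
        int top = (sensorHeight - cropH) / 2;
        return new int[]{ left, top, left + cropW, top + cropH };
    }

    public static void main(String [] args) {
        // 2x zoom on a 4000x3000 sensor keeps the central 2000x1500 region
        int [] r = centredCrop(4000, 3000, 2.0f);
        System.out.println(r[0] + "," + r[1] + "," + r[2] + "," + r[3]);
    }
}
```

On Android 11+ the `Build.VERSION.SDK_INT < Build.VERSION_CODES.R` guard above prevents setting both keys at once, since `CONTROL_ZOOM_RATIO` supersedes crop-based zoom.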
@@ -755,6 +863,10 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "don't set exposure compensation in manual iso mode");
return false;
}
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ return false;
+ }
if( builder.get(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION) == null || ae_exposure_compensation != builder.get(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION) ) {
if( MyDebug.LOG )
Log.d(TAG, "change exposure to " + ae_exposure_compensation);
@@ -765,41 +877,73 @@ public class CameraController2 extends CameraController {
}
private void setFocusMode(CaptureRequest.Builder builder) {
- if( has_af_mode ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( has_af_mode ) {
if( MyDebug.LOG )
Log.d(TAG, "change af mode to " + af_mode);
builder.set(CaptureRequest.CONTROL_AF_MODE, af_mode);
}
+ else {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "af mode left at " + builder.get(CaptureRequest.CONTROL_AF_MODE));
+ }
+ }
}
private void setFocusDistance(CaptureRequest.Builder builder) {
if( MyDebug.LOG )
Log.d(TAG, "change focus distance to " + focus_distance);
- builder.set(CaptureRequest.LENS_FOCUS_DISTANCE, focus_distance);
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else {
+ builder.set(CaptureRequest.LENS_FOCUS_DISTANCE, focus_distance);
+ }
}
private void setAutoExposureLock(CaptureRequest.Builder builder) {
- builder.set(CaptureRequest.CONTROL_AE_LOCK, ae_lock);
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else {
+ builder.set(CaptureRequest.CONTROL_AE_LOCK, ae_lock);
+ }
}
private void setAutoWhiteBalanceLock(CaptureRequest.Builder builder) {
- builder.set(CaptureRequest.CONTROL_AWB_LOCK, wb_lock);
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else {
+ builder.set(CaptureRequest.CONTROL_AWB_LOCK, wb_lock);
+ }
}
private void setAFRegions(CaptureRequest.Builder builder) {
- if( af_regions != null && characteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AF) > 0 ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( af_regions != null && characteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AF) > 0 ) {
builder.set(CaptureRequest.CONTROL_AF_REGIONS, af_regions);
}
}
private void setAERegions(CaptureRequest.Builder builder) {
- if( ae_regions != null && characteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AE) > 0 ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( ae_regions != null && characteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AE) > 0 ) {
builder.set(CaptureRequest.CONTROL_AE_REGIONS, ae_regions);
}
}
private void setFaceDetectMode(CaptureRequest.Builder builder) {
- if( has_face_detect_mode )
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( has_face_detect_mode )
builder.set(CaptureRequest.STATISTICS_FACE_DETECT_MODE, face_detect_mode);
else
builder.set(CaptureRequest.STATISTICS_FACE_DETECT_MODE, CaptureRequest.STATISTICS_FACE_DETECT_MODE_OFF);
@@ -816,6 +960,12 @@ public class CameraController2 extends CameraController {
private void setStabilization(CaptureRequest.Builder builder) {
if( MyDebug.LOG )
Log.d(TAG, "setStabilization: " + video_stabilization);
+
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ return;
+ }
+
builder.set(CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE, video_stabilization ? CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE_ON : CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE_OFF);
if( supports_optical_stabilization ) {
if( video_stabilization ) {
@@ -885,7 +1035,10 @@ public class CameraController2 extends CameraController {
//if( test_new )
// have_tonemap_profile = false;
- if( have_tonemap_profile ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // don't set for extensions
+ }
+ else if( have_tonemap_profile ) {
if( default_tonemap_mode == null ) {
// save the default tonemap_mode
default_tonemap_mode = builder.get(CaptureRequest.TONEMAP_MODE);
@@ -1125,9 +1278,26 @@ public class CameraController2 extends CameraController {
// n.b., if we add more methods, remember to update setupBuilder() above!
}
+ private boolean hasCaptureSession() {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION )
+ return extensionSession != null;
+ return captureSession != null;
+ }
+
+ private void BLOCK_FOR_EXTENSIONS() {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ throw new RuntimeException("not supported for extension session");
+ }
+ }
+
+ private static RggbChannelVector convertTemperatureToRggbVector(int temperature_kelvin) {
+ float [] rggb = convertTemperatureToRggb(temperature_kelvin);
+ return new RggbChannelVector(rggb[0], rggb[1], rggb[2], rggb[3]);
+ }
+
/** Converts a white balance temperature to red, green even, green odd and blue components.
*/
- private RggbChannelVector convertTemperatureToRggb(int temperature_kelvin) {
+ public static float [] convertTemperatureToRggb(int temperature_kelvin) {
float temperature = temperature_kelvin / 100.0f;
float red;
float green;
@@ -1180,63 +1350,90 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "green: " + green);
Log.d(TAG, "blue: " + blue);
}
- return new RggbChannelVector((red/255)*2,(green/255),(green/255),(blue/255)*2);
+
+ red = (red/255.0f);
+ green = (green/255.0f);
+ blue = (blue/255.0f);
+
+ red = RGBtoGain(red);
+ green = RGBtoGain(green);
+ blue = RGBtoGain(blue);
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "red gain: " + red);
+ Log.d(TAG, "green gain: " + green);
+ Log.d(TAG, "blue gain: " + blue);
+ }
+
+ return new float[]{red,green/2,green/2,blue};
+ }
+
+ private static float RGBtoGain(float value) {
+ final float max_gain_c = 10.0f;
+ if( value < 1.0e-5f ) {
+ return max_gain_c;
+ }
+ value = 1.0f/value;
+ value = Math.min(max_gain_c, value);
+ return value;
+ }
+
+ public static int convertRggbVectorToTemperature(RggbChannelVector rggbChannelVector) {
+ return convertRggbToTemperature(new float[]{rggbChannelVector.getRed(), rggbChannelVector.getGreenEven(), rggbChannelVector.getGreenOdd(), rggbChannelVector.getBlue()});
}
/** Converts a red, green even, green odd and blue components to a white balance temperature.
* Note that this is not necessarily an inverse of convertTemperatureToRggb, since many rggb
* values can map to the same temperature.
*/
- private int convertRggbToTemperature(RggbChannelVector rggbChannelVector) {
+ public static int convertRggbToTemperature(float [] rggb) {
if( MyDebug.LOG ) {
Log.d(TAG, "temperature:");
- Log.d(TAG, " red: " + rggbChannelVector.getRed());
- Log.d(TAG, " green even: " + rggbChannelVector.getGreenEven());
- Log.d(TAG, " green odd: " + rggbChannelVector.getGreenOdd());
- Log.d(TAG, " blue: " + rggbChannelVector.getBlue());
- }
- float red = rggbChannelVector.getRed();
- float green_even = rggbChannelVector.getGreenEven();
- float green_odd = rggbChannelVector.getGreenOdd();
- float blue = rggbChannelVector.getBlue();
- float green = 0.5f*(green_even + green_odd);
-
- float max = Math.max(red, blue);
- if( green > max )
- green = max;
-
- float scale = 255.0f/max;
- red *= scale;
- green *= scale;
- blue *= scale;
-
- int red_i = (int)red;
- int green_i = (int)green;
- int blue_i = (int)blue;
+ Log.d(TAG, " red: " + rggb[0]);
+ Log.d(TAG, " green even: " + rggb[1]);
+ Log.d(TAG, " green odd: " + rggb[2]);
+ Log.d(TAG, " blue: " + rggb[3]);
+ }
+ float red = rggb[0];
+ float green_even = rggb[1];
+ float green_odd = rggb[2];
+ float blue = rggb[3];
+ float green = (green_even + green_odd);
+
+ red = GaintoRGB(red);
+ green = GaintoRGB(green);
+ blue = GaintoRGB(blue);
+
+ red *= 255.0f;
+ green *= 255.0f;
+ blue *= 255.0f;
+
+ int red_i = (int)(red+0.5f);
+ int green_i = (int)(green+0.5f);
+ int blue_i = (int)(blue+0.5f);
int temperature;
if( red_i == blue_i ) {
temperature = 6600;
}
else if( red_i > blue_i ) {
// temperature <= 6600
- int t_g = (int)( 100 * Math.exp((green_i + 161.1195681661) / 99.4708025861) );
+ float t_g = (float)( 100 * Math.exp((green + 161.1195681661) / 99.4708025861) );
if( blue_i == 0 ) {
- temperature = t_g;
+ temperature = (int)(t_g+0.5f);
}
else {
- int t_b = (int)( 100 * (Math.exp((blue_i + 305.0447927307) / 138.5177312231) + 10) );
- temperature = (t_g + t_b)/2;
+ float t_b = (float)( 100 * (Math.exp((blue + 305.0447927307) / 138.5177312231) + 10) );
+ temperature = (int)((t_g + t_b)/2+0.5f);
}
}
else {
- // temperature >= 6700
+ // temperature >= 6600
if( red_i <= 1 || green_i <= 1 ) {
temperature = max_white_balance_temperature_c;
}
else {
- int t_r = (int)(100 * (Math.pow(red_i / 329.698727446, 1.0 / -0.1332047592) + 60.0));
- int t_g = (int)(100 * (Math.pow(green_i / 288.1221695283, 1.0 / -0.0755148492) + 60.0));
- temperature = (t_r + t_g) / 2;
+ float t_r = (float)(100 * (Math.pow(red / 329.698727446, 1.0 / -0.1332047592) + 60.0));
+ float t_g = (float)(100 * (Math.pow(green / 288.1221695283, 1.0 / -0.0755148492) + 60.0));
+ temperature = (int)((t_r + t_g)/2+0.5f);
}
}
temperature = Math.max(temperature, min_white_balance_temperature_c);
@@ -1247,7 +1444,17 @@ public class CameraController2 extends CameraController {
return temperature;
}
+ private static float GaintoRGB(float value) {
+ if( value <= 1.0f ) {
+ return 1.0f;
+ }
+ value = 1.0f/value;
+ return value;
+ }
+
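The RGBtoGain/GaintoRGB pair above is deliberately asymmetric: gains are capped at 10x on the way in, and any gain at or below unity maps back to full brightness on the way out. A standalone port (plain Java, class name hypothetical) to make the clamping behaviour concrete:

```java
// Illustrative standalone version of the RGBtoGain/GaintoRGB helpers in the
// patch above, showing the clamps: near-zero channels get the maximum gain
// (10x) instead of diverging, and gains <= 1x round-trip to brightness 1.0.
public class WhiteBalanceGains {
    static final float MAX_GAIN = 10.0f;

    // normalized channel brightness (0..1) -> sensor colour gain
    static float rgbToGain(float value) {
        if( value < 1.0e-5f )
            return MAX_GAIN; // avoid divide-by-zero; darkest channel gets max gain
        return Math.min(MAX_GAIN, 1.0f/value);
    }

    // sensor colour gain -> normalized channel brightness
    static float gainToRGB(float value) {
        if( value <= 1.0f )
            return 1.0f; // gains at or below 1x clamp to full brightness
        return 1.0f/value;
    }

    public static void main(String [] args) {
        System.out.println(rgbToGain(0.5f)); // half brightness needs 2x gain
        System.out.println(gainToRGB(2.0f)); // and 2x gain maps back to 0.5
    }
}
```

This is why convertRggbToTemperature is documented as not an exact inverse: every value clamped at either end collapses to the same output.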
private class OnImageAvailableListener implements ImageReader.OnImageAvailableListener {
+ private boolean skip_next_image = false; // whether to ignore the next image (used for dummy_capture_hack)
+
@Override
public void onImageAvailable(ImageReader reader) {
if( MyDebug.LOG )
@@ -1260,6 +1467,14 @@ public class CameraController2 extends CameraController {
image.close();
return;
}
+ if( skip_next_image ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "skipping image");
+ skip_next_image = false;
+ Image image = reader.acquireNextImage();
+ image.close();
+ return;
+ }
List<byte []> single_burst_complete_images = null;
boolean call_takePhotoPartial = false;
@@ -1362,6 +1577,7 @@ public class CameraController2 extends CameraController {
private void takePhotoPartial() {
if( MyDebug.LOG )
Log.d(TAG, "takePhotoPartial");
+ BLOCK_FOR_EXTENSIONS(); // not supported for extension sessions
ErrorCallback push_take_picture_error_cb = null;
@@ -1373,7 +1589,7 @@ public class CameraController2 extends CameraController {
}
if( burst_type != BurstType.BURSTTYPE_FOCUS ) {
try {
- if( camera != null && captureSession != null ) { // make sure camera wasn't released in the meantime
+ if( camera != null && hasCaptureSession() ) { // make sure camera wasn't released in the meantime
captureSession.capture(slow_burst_capture_requests.get(n_burst_taken), previewCaptureCallback, handler);
}
}
@@ -1456,7 +1672,7 @@ public class CameraController2 extends CameraController {
public void run(){
if( MyDebug.LOG )
Log.d(TAG, "take picture after delay for next focus bracket");
- if( camera != null && captureSession != null ) { // make sure camera wasn't released in the meantime
+ if( camera != null && hasCaptureSession() ) { // make sure camera wasn't released in the meantime
if( picture_cb.imageQueueWouldBlock(imageReaderRaw != null ? 1 : 0, 1) ) {
if( MyDebug.LOG ) {
Log.d(TAG, "...but wait for next focus bracket, as image queue would block");
@@ -1519,6 +1735,7 @@ public class CameraController2 extends CameraController {
private class OnRawImageAvailableListener implements ImageReader.OnImageAvailableListener {
private final Queue<CaptureResult> capture_results = new LinkedList<>();
private final Queue<Image> images = new LinkedList<>();
+ private boolean skip_next_image = false; // whether to ignore the next image (used for dummy_capture_hack)
void setCaptureResult(CaptureResult capture_result) {
if( MyDebug.LOG )
@@ -1685,6 +1902,14 @@ public class CameraController2 extends CameraController {
this_image.close();
return;
}
+ if( skip_next_image ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "skipping image");
+ skip_next_image = false;
+ Image image = reader.acquireNextImage();
+ image.close();
+ return;
+ }
synchronized( background_camera_lock ) {
// see comment above in setCaptureResult() for why we synchronize
Image image = reader.acquireNextImage();
@@ -1745,25 +1970,42 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "this: " + this);
}
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ) {
+ this.previewExtensionCaptureCallback = new MyExtensionCaptureCallback();
+ }
+ else {
+ this.previewExtensionCaptureCallback = null;
+ }
+
this.context = context;
this.preview_error_cb = preview_error_cb;
this.camera_error_cb = camera_error_cb;
+ this.is_oneplus = Build.MANUFACTURER.toLowerCase(Locale.US).contains("oneplus");
this.is_samsung = Build.MANUFACTURER.toLowerCase(Locale.US).contains("samsung");
this.is_samsung_s7 = Build.MODEL.toLowerCase(Locale.US).contains("sm-g93");
+ this.is_samsung_galaxy_s = is_samsung && Build.MODEL.toLowerCase(Locale.US).contains("sm-g");
if( MyDebug.LOG ) {
+ Log.d(TAG, "is_oneplus: " + is_oneplus);
Log.d(TAG, "is_samsung: " + is_samsung);
Log.d(TAG, "is_samsung_s7: " + is_samsung_s7);
+ Log.d(TAG, "is_samsung_galaxy_s: " + is_samsung_galaxy_s);
}
thread = new HandlerThread("CameraBackground");
thread.start();
handler = new Handler(thread.getLooper());
+ executor = new Executor() {
+ @Override
+ public void execute(Runnable command) {
+ handler.post(command);
+ }
+ };
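The executor added above simply forwards work onto the existing background Handler thread. The same post-to-one-thread semantics can be sketched in plain Java (the class and method names below are illustrative stand-ins, not part of the diff; Android's Handler/HandlerThread are replaced by a queue-draining thread):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.Executor;

public class PostingExecutor {
    // A single background thread draining a queue, standing in for the
    // Android HandlerThread/Handler pair used by the camera controller.
    private final BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>();
    private final Thread thread = new Thread(() -> {
        try {
            while( true )
                queue.take().run(); // run posted tasks in submission order
        }
        catch(InterruptedException e) {
            // interrupt is treated as the quit signal
        }
    });

    public PostingExecutor() {
        thread.start();
    }

    // Equivalent of wrapping handler.post(command) in an Executor
    public Executor executor() {
        return command -> queue.add(command);
    }

    public void quit() throws InterruptedException {
        thread.interrupt();
        thread.join();
    }
}
```

This mirrors why the diff can implement `execute()` as `handler.post(command)`: both guarantee tasks run serially on one dedicated thread, which the extension-session callbacks require via their Executor parameter.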
final CameraManager manager = (CameraManager)context.getSystemService(Context.CAMERA_SERVICE);
class MyStateCallback extends CameraDevice.StateCallback {
- boolean callback_done; // must sychronize on this and notifyAll when setting to true
+ boolean callback_done; // must synchronize on this and notifyAll when setting to true
boolean first_callback = true; // Google Camera says we may get multiple callbacks, but only the first indicates the status of the camera opening operation
@Override
public void onOpened(@NonNull CameraDevice cam) {
@@ -1805,6 +2047,12 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "characteristics_facing: " + characteristics_facing);
}
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ) {
+ extension_characteristics = manager.getCameraExtensionCharacteristics(cameraIdS);
+ if( MyDebug.LOG )
+ Log.d(TAG, "successfully obtained camera extension characteristics");
+ }
+
CameraController2.this.camera = cam;
// note, this won't start the preview yet, but we create the previewBuilder in order to start setting camera parameters
@@ -2033,17 +2281,37 @@ public class CameraController2 extends CameraController {
jtlog2_values = enforceMinTonemapCurvePoints(jtlog2_values_base);
}
- @Override
- public void release() {
- if( MyDebug.LOG )
- Log.d(TAG, "release: " + this);
+ /** Closes the captureSession, if it exists.
+ */
+ private void closeCaptureSession() {
synchronized( background_camera_lock ) {
if( captureSession != null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "close capture session");
captureSession.close();
captureSession = null;
- //pending_request_when_ready = null;
+ }
+ if( extensionSession != null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "close extension session");
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ) {
+ try {
+ extensionSession.close();
+ }
+ catch(CameraAccessException e) {
+ e.printStackTrace();
+ }
+ }
+ extensionSession = null;
}
}
+ }
+
+ @Override
+ public void release() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "release: " + this);
+ closeCaptureSession();
previewBuilder = null;
previewIsVideoMode = false;
if( camera != null ) {
@@ -2063,6 +2331,7 @@ public class CameraController2 extends CameraController {
thread.join();
thread = null;
handler = null;
+ executor = null;
}
catch(InterruptedException e) {
e.printStackTrace();
@@ -2145,6 +2414,7 @@ public class CameraController2 extends CameraController {
if( imageReader != null ) {
imageReader.close();
imageReader = null;
+ onImageAvailableListener = null;
}
if( imageReaderRaw != null ) {
imageReaderRaw.close();
@@ -2154,10 +2424,15 @@ public class CameraController2 extends CameraController {
}
private List<String> convertFocusModesToValues(int [] supported_focus_modes_arr, float minimum_focus_distance) {
- if( MyDebug.LOG )
+ if( MyDebug.LOG ) {
Log.d(TAG, "convertFocusModesToValues()");
- if( supported_focus_modes_arr.length == 0 )
+ Log.d(TAG, "supported_focus_modes_arr: " + Arrays.toString(supported_focus_modes_arr));
+ }
+ if( supported_focus_modes_arr.length == 0 ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "no supported focus modes");
return null;
+ }
List<Integer> supported_focus_modes = new ArrayList<>();
for(Integer supported_focus_mode : supported_focus_modes_arr)
supported_focus_modes.add(supported_focus_mode);
@@ -2182,6 +2457,9 @@ public class CameraController2 extends CameraController {
}
if( supported_focus_modes.contains(CaptureRequest.CONTROL_AF_MODE_OFF) ) {
output_modes.add("focus_mode_infinity");
+ if( MyDebug.LOG ) {
+ Log.d(TAG, " supports focus_mode_infinity");
+ }
if( minimum_focus_distance > 0.0f ) {
output_modes.add("focus_mode_manual2");
if( MyDebug.LOG ) {
@@ -2247,33 +2525,127 @@ public class CameraController2 extends CameraController {
+ if( MyDebug.LOG && Build.VERSION.SDK_INT >= Build.VERSION_CODES.R ) {
+ Capability [] capabilities = characteristics.get(CameraCharacteristics.CONTROL_AVAILABLE_EXTENDED_SCENE_MODE_CAPABILITIES);
+ Log.d(TAG, "capabilities:");
+ if( capabilities == null ) {
+ Log.d(TAG, " none");
+ }
+ else {
+ for(int i=0;i<capabilities.length;i++) {
+ Log.d(TAG, "    capability: " + capabilities[i]);
+ }
+ }
+ }
- float max_zoom = characteristics.get(CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM);
- camera_features.is_zoom_supported = max_zoom > 0.0f;
- if( MyDebug.LOG )
+ float min_zoom = 0.0f;
+ float max_zoom = 0.0f;
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.R ) {
+ // use CONTROL_ZOOM_RATIO_RANGE on Android 11+, to support multiple cameras with zoom ratios
+ // less than 1
+ Range<Float> zoom_ratio_range = characteristics.get(CameraCharacteristics.CONTROL_ZOOM_RATIO_RANGE);
+ if( zoom_ratio_range != null ) {
+ min_zoom = zoom_ratio_range.getLower();
+ max_zoom = zoom_ratio_range.getUpper();
+ }
+ else {
+ if( MyDebug.LOG )
+ Log.d(TAG, "zoom_ratio_range not supported");
+ }
+ }
+ else {
+ min_zoom = 1.0f;
+ max_zoom = characteristics.get(CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM);
+ }
+ camera_features.is_zoom_supported = max_zoom > 0.0f && min_zoom > 0.0f;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "min_zoom: " + min_zoom);
Log.d(TAG, "max_zoom: " + max_zoom);
+ }
if( camera_features.is_zoom_supported ) {
+ float zoom_max_min_ratio = max_zoom / min_zoom;
// set 20 steps per 2x factor
final int steps_per_2x_factor = 20;
- //final double scale_factor = Math.pow(2.0, 1.0/(double)steps_per_2x_factor);
- int n_steps =(int)( (steps_per_2x_factor * Math.log(max_zoom + 1.0e-11)) / Math.log(2.0));
- final double scale_factor = Math.pow(max_zoom, 1.0/(double)n_steps);
+ int n_steps = (int)( (steps_per_2x_factor * Math.log(zoom_max_min_ratio + 1.0e-11)) / Math.log(2.0));
if( MyDebug.LOG ) {
Log.d(TAG, "n_steps: " + n_steps);
- Log.d(TAG, "scale_factor: " + scale_factor);
}
+
camera_features.zoom_ratios = new ArrayList<>();
- camera_features.zoom_ratios.add(100);
- double zoom = 1.0;
- for(int i=0;i<n_steps-1;i++) {
- zoom *= scale_factor;
- camera_features.zoom_ratios.add((int)(zoom*100));
- }
- camera_features.zoom_ratios.add((int)(max_zoom*100));
+ if( zoom_ratio > camera_features.zoom_ratios.get(0) ) {
+ // on some devices (e.g., Pixel 6 Pro), the second entry would equal the first entry, due to the rounding fix above
+ camera_features.zoom_ratios.add(zoom_ratio);
+ }
+ }
+
+ // add values for 1.0f
+ zoom_value_1x = camera_features.zoom_ratios.size();
+ camera_features.zoom_ratios.add(100);
+
+ // add values for zoom > 1.0f
+ int n_steps_above_one = Math.max(1, (int)( (steps_per_2x_factor * Math.log(max_zoom + 1.0e-11)) / Math.log(2.0) ));
+ double zoom = 1.0f;
+ final double scale_factor = Math.pow(max_zoom, 1.0/(double)n_steps_above_one);
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "scale_factor for above 1.0x: " + scale_factor);
+ }
+ for(int i=0;i<n_steps_above_one-1;i++) {
+ zoom *= scale_factor;
+ camera_features.zoom_ratios.add((int)(zoom*100));
+ }
+ camera_features.zoom_ratios.add((int)(max_zoom*100));
camera_features.picture_sizes = new ArrayList<>();
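The logarithmic zoom-step construction in this hunk (about 20 steps per 2x factor, spanning min_zoom up to max_zoom, with duplicate integer percentages skipped) can be reproduced as a standalone sketch; the class and method names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

public class ZoomSteps {
    // Build zoom ratios (in percent) spaced logarithmically from min_zoom
    // to max_zoom, aiming for about 20 steps per 2x zoom factor.
    static List<Integer> zoomRatios(float min_zoom, float max_zoom) {
        final int steps_per_2x_factor = 20;
        double zoom_max_min_ratio = max_zoom / min_zoom;
        int n_steps = (int)( (steps_per_2x_factor * Math.log(zoom_max_min_ratio + 1.0e-11)) / Math.log(2.0) );
        final double scale_factor = Math.pow(zoom_max_min_ratio, 1.0/(double)n_steps);
        List<Integer> zoom_ratios = new ArrayList<>();
        zoom_ratios.add((int)(min_zoom*100));
        double zoom = min_zoom;
        for(int i=0;i<n_steps-1;i++) {
            zoom *= scale_factor;
            int zoom_ratio = (int)(zoom*100);
            // skip duplicates caused by rounding to integer percentages
            // (the diff notes this happens on, e.g., the Pixel 6 Pro)
            if( zoom_ratio > zoom_ratios.get(zoom_ratios.size()-1) )
                zoom_ratios.add(zoom_ratio);
        }
        int max_ratio = (int)(max_zoom*100);
        if( max_ratio > zoom_ratios.get(zoom_ratios.size()-1) )
            zoom_ratios.add(max_ratio);
        return zoom_ratios;
    }
}
```

The geometric spacing means each step changes the zoom by the same percentage, which feels uniform to the user across the whole range, including the sub-1x ultra-wide ratios that Android 11's CONTROL_ZOOM_RATIO_RANGE exposes.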
- if( android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.M ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
android.util.Size [] camera_picture_sizes_hires = configs.getHighResolutionOutputSizes(ImageFormat.JPEG);
if( camera_picture_sizes_hires != null ) {
for(android.util.Size camera_size : camera_picture_sizes_hires) {
@@ -2584,6 +2956,69 @@ public class CameraController2 extends CameraController {
}
}
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ) {
+ List<Integer> extensions = extension_characteristics.getSupportedExtensions();
+ if( extensions != null ) {
+ camera_features.supported_extensions = new ArrayList<>();
+ for(int extension : extensions) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "vendor extension: " + extension);
+
+ // we assume that the allowed extension sizes are a subset of the full sizes - makes things easier to manage
+
+ List<android.util.Size> extension_picture_sizes = extension_characteristics.getExtensionSupportedSizes(extension, ImageFormat.JPEG);
+ if( MyDebug.LOG )
+ Log.d(TAG, " extension_picture_sizes: " + extension_picture_sizes);
+ boolean has_picture_resolution = false;
+ for(CameraController.Size size : camera_features.picture_sizes) {
+ if( extension_picture_sizes.contains(new android.util.Size(size.width, size.height)) ) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, " picture size supports extension: " + size.width + " , " + size.height);
+ }
+ has_picture_resolution = true;
+ if( size.supported_extensions == null ) {
+ size.supported_extensions = new ArrayList<>();
+ }
+ size.supported_extensions.add(extension);
+ }
+ else {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, " picture size does NOT support extension: " + size.width + " , " + size.height);
+ }
+ }
+ }
+
+ List<android.util.Size> extension_preview_sizes = extension_characteristics.getExtensionSupportedSizes(extension, SurfaceTexture.class);
+ if( MyDebug.LOG )
+ Log.d(TAG, " extension_preview_sizes: " + extension_preview_sizes);
+ boolean has_preview_resolution = false;
+ for(CameraController.Size size : camera_features.preview_sizes) {
+ if( extension_preview_sizes.contains(new android.util.Size(size.width, size.height)) ) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, " preview size supports extension: " + size.width + " , " + size.height);
+ }
+ has_preview_resolution = true;
+ if( size.supported_extensions == null ) {
+ size.supported_extensions = new ArrayList<>();
+ }
+ size.supported_extensions.add(extension);
+ }
+ else {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, " preview size does NOT support extension: " + size.width + " , " + size.height);
+ }
+ }
+ }
+
+ if( has_picture_resolution && has_preview_resolution ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, " extension is supported: " + extension);
+ camera_features.supported_extensions.add(extension);
+ }
+ }
+ }
+ }
+
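The extension checks above boil down to intersecting the device's normal size list with the extension-supported sizes, and only accepting the extension when both the picture and preview intersections are non-empty. A minimal standalone sketch of that intersection (the `Size` record here is a hypothetical stand-in for android.util.Size):

```java
import java.util.ArrayList;
import java.util.List;

public class ExtensionSizes {
    // Hypothetical stand-in for android.util.Size
    record Size(int width, int height) {}

    // Return the subset of 'all' sizes that the extension also supports;
    // an extension is only usable if this intersection is non-empty.
    static List<Size> supportedForExtension(List<Size> all, List<Size> extensionSizes) {
        List<Size> result = new ArrayList<>();
        for(Size size : all) {
            if( extensionSizes.contains(size) )
                result.add(size);
        }
        return result;
    }
}
```

Restricting each extension to a subset of the already-known sizes (rather than keeping a separate size list per extension) is what the diff's comment means by "makes things easier to manage".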
if( characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE) ) {
camera_features.supported_flash_values = new ArrayList<>();
camera_features.supported_flash_values.add("flash_off");
@@ -2617,6 +3052,22 @@ public class CameraController2 extends CameraController {
if( camera_features.supported_focus_values != null && camera_features.supported_focus_values.contains("focus_mode_manual2") ) {
camera_features.supports_focus_bracketing = true;
}
+ if( camera_features.supported_focus_values != null ) {
+ // prefer continuous focus mode
+ if( camera_features.supported_focus_values.contains("focus_mode_continuous_picture") ) {
+ initial_focus_mode = "focus_mode_continuous_picture";
+ }
+ else {
+ // just go with the first one
+ initial_focus_mode = camera_features.supported_focus_values.get(0);
+ }
+ if( MyDebug.LOG )
+ Log.d(TAG, "initial_focus_mode: " + initial_focus_mode);
+ }
+ else {
+ initial_focus_mode = null;
+ }
+
camera_features.max_num_focus_areas = characteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AF);
camera_features.is_exposure_lock_supported = true;
@@ -2686,9 +3137,22 @@ public class CameraController2 extends CameraController {
camera_features.max_expo_bracketing_n_images = max_expo_bracketing_n_images;
camera_features.min_exposure_time = exposure_time_range.getLower();
camera_features.max_exposure_time = exposure_time_range.getUpper();
+ if( is_samsung_galaxy_s && Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ) {
+ // seems we can get away with longer exposure on some devices (e.g., Galaxy S10e claims only max of 0.1s, but works with 1/3s)
+ // but Android 11 on Samsung devices also introduces a bug where manual exposure gets ignored if different to the preview,
+ // and since the max preview rate is limited to 1/5s (see max_preview_exposure_time_c), there's no point
+ // going above this!
+ if( MyDebug.LOG )
+ Log.d(TAG, "boost max_exposure_time, was: " + max_exposure_time);
+ camera_features.max_exposure_time = Math.max(camera_features.max_exposure_time, 1000000000L/5);
+ }
}
}
}
+ // save to local fields:
+ this.supports_exposure_time = camera_features.supports_exposure_time;
+ this.min_exposure_time = camera_features.min_exposure_time;
+ this.max_exposure_time = camera_features.max_exposure_time;
Range<Integer> exposure_range = characteristics.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE);
camera_features.min_exposure = exposure_range.getLower();
@@ -3686,7 +4150,7 @@ public class CameraController2 extends CameraController {
Log.e(TAG, "no camera");
return;
}
- if( captureSession != null ) {
+ if( hasCaptureSession() ) {
// can only call this when captureSession not created - as the surface of the imageReader we create has to match the surface we pass to the captureSession
if( MyDebug.LOG )
Log.e(TAG, "can't set picture size when captureSession running!");
@@ -3715,7 +4179,7 @@ public class CameraController2 extends CameraController {
Log.e(TAG, "can't set raw when raw not supported");
return;
}
- if( captureSession != null ) {
+ if( hasCaptureSession() ) {
// can only call this when captureSession not created - as it affects how we create the imageReader
if( MyDebug.LOG )
Log.e(TAG, "can't set raw when captureSession running!");
@@ -3737,7 +4201,7 @@ public class CameraController2 extends CameraController {
if( this.want_video_high_speed == want_video_high_speed ) {
return;
}
- if( captureSession != null ) {
+ if( hasCaptureSession() ) {
// can only call this when captureSession not created - as it affects how we create the session
if( MyDebug.LOG )
Log.e(TAG, "can't set high speed when captureSession running!");
@@ -3747,6 +4211,62 @@ public class CameraController2 extends CameraController {
this.is_video_high_speed = false; // reset just to be safe
}
+ @Override
+ public void setCameraExtension(boolean enabled, int extension) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "setCameraExtension?: " + enabled);
+ Log.d(TAG, "extension: " + extension);
+ }
+
+ if( camera == null ) {
+ if( MyDebug.LOG )
+ Log.e(TAG, "no camera");
+ return;
+ }
+ if( hasCaptureSession() ) {
+ // can only call this when captureSession not created - as it affects how we create the imageReader
+ if( MyDebug.LOG )
+ Log.e(TAG, "can't set extension when captureSession running!");
+ throw new RuntimeException(); // throw as RuntimeException, as this is a programming error
+ }
+
+ if( enabled != (sessionType == SessionType.SESSIONTYPE_EXTENSION) ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "turning extension session on or off");
+ // Ideally we'd probably only create the previewBuilder when starting the preview (so we
+ // start off with a "fresh" one), but for now at least ensure we start off with a fresh
+ // previewBuilder when enabling extensions (and might as well do so when disabling
+ // extensions too).
+ // This saves us having to set capture request parameters back to their defaults, and is
+ // also useful for modes like CONTROL_AE_ANTIBANDING_MODE where there isn't an obvious
+ // "default" to set (in theory extensions mode should just ignore such keys, but it'd be
+ // nicer to never set them).
+ previewBuilder = null;
+ createPreviewRequest();
+ }
+
+ if( enabled ) {
+ this.sessionType = SessionType.SESSIONTYPE_EXTENSION;
+ this.camera_extension = extension;
+ }
+ else {
+ this.sessionType = SessionType.SESSIONTYPE_NORMAL;
+ this.camera_extension = 0;
+ }
+ }
+
+ @Override
+ public boolean isCameraExtension() {
+ return this.sessionType == SessionType.SESSIONTYPE_EXTENSION;
+ }
+
+ @Override
+ public int getCameraExtension() {
+ if( isCameraExtension() )
+ return camera_extension;
+ return -1;
+ }
+
@Override
public void setBurstType(BurstType burst_type) {
if( MyDebug.LOG )
@@ -3759,7 +4279,7 @@ public class CameraController2 extends CameraController {
if( this.burst_type == burst_type ) {
return;
}
- /*if( captureSession != null ) {
+ /*if( hasCaptureSession() ) {
// can only call this when captureSession not created - as it affects how we create the imageReader
if( MyDebug.LOG )
Log.e(TAG, "can't set burst type when captureSession running!");
@@ -3782,7 +4302,7 @@ public class CameraController2 extends CameraController {
if( n_images <= 1 || (n_images % 2) == 0 ) {
if( MyDebug.LOG )
Log.e(TAG, "n_images should be an odd number greater than 1");
- throw new RuntimeException(); // throw as RuntimeException, as this is a programming error
+ throw new RuntimeException("n_images should be an odd number greater than 1"); // throw as RuntimeException, as this is a programming error
}
if( n_images > max_expo_bracketing_n_images ) {
n_images = max_expo_bracketing_n_images;
@@ -3804,6 +4324,13 @@ public class CameraController2 extends CameraController {
this.expo_bracketing_stops = stops;
}
+ @Override
+ public void setDummyCaptureHack(boolean dummy_capture_hack) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "setDummyCaptureHack: " + dummy_capture_hack);
+ this.dummy_capture_hack = dummy_capture_hack;
+ }
+
@Override
public void setUseExpoFastBurst(boolean use_expo_fast_burst) {
if( MyDebug.LOG )
@@ -3836,18 +4363,25 @@ public class CameraController2 extends CameraController {
return 0; // total burst size is unknown
return n_burst_total;
}
+
@Override
public void setOptimiseAEForDRO(boolean optimise_ae_for_dro) {
if( MyDebug.LOG )
Log.d(TAG, "setOptimiseAEForDRO: " + optimise_ae_for_dro);
- boolean is_oneplus = Build.MANUFACTURER.toLowerCase(Locale.US).contains("oneplus");
if( is_oneplus ) {
- // OnePlus 3T has preview corruption / camera freezing problems when using manual shutter speeds
- // So best not to modify auto-exposure for DRO
+ // OnePlus 3T has preview corruption / camera freezing problems when using manual shutter speeds.
+ // So best not to modify auto-exposure for DRO.
this.optimise_ae_for_dro = false;
if( MyDebug.LOG )
Log.d(TAG, "don't modify ae for OnePlus");
}
+ else if( is_samsung ) {
+ // At least some Samsung devices (e.g., Galaxy S10e on Android 11) give better results in auto mode
+ // than manual mode, so we're better off staying in auto mode.
+ this.optimise_ae_for_dro = false;
+ if( MyDebug.LOG )
+ Log.d(TAG, "don't modify ae for Samsung");
+ }
else {
this.optimise_ae_for_dro = optimise_ae_for_dro;
}
@@ -3919,7 +4453,7 @@ public class CameraController2 extends CameraController {
private void createPictureImageReader() {
if( MyDebug.LOG )
Log.d(TAG, "createPictureImageReader");
- if( captureSession != null ) {
+ if( hasCaptureSession() ) {
// can only call this when captureSession not created - as the surface of the imageReader we create has to match the surface we pass to the captureSession
if( MyDebug.LOG )
Log.e(TAG, "can't create picture image reader when captureSession running!");
@@ -3935,19 +4469,19 @@ public class CameraController2 extends CameraController {
imageReader = ImageReader.newInstance(picture_width, picture_height, ImageFormat.JPEG, 2);
//imageReader = ImageReader.newInstance(picture_width, picture_height, ImageFormat.YUV_420_888, 2);
if( MyDebug.LOG ) {
- Log.d(TAG, "created new imageReader: " + imageReader.toString());
+ Log.d(TAG, "created new imageReader: " + imageReader);
Log.d(TAG, "imageReader surface: " + imageReader.getSurface().toString());
}
// It's intentional that we pass a handler of null, so the OnImageAvailableListener runs on the UI thread.
// If ever we want to change this on future, we should ensure that all image available listeners (JPEG+RAW) are
// using the same handler/thread.
- imageReader.setOnImageAvailableListener(new OnImageAvailableListener(), null);
+ imageReader.setOnImageAvailableListener(onImageAvailableListener = new OnImageAvailableListener(), null);
if( want_raw && raw_size != null && !previewIsVideoMode ) {
// unlike the JPEG imageReader, we can't read the data and close the image straight away, so we need to allow a larger
// value for maxImages
imageReaderRaw = ImageReader.newInstance(raw_size.getWidth(), raw_size.getHeight(), ImageFormat.RAW_SENSOR, max_raw_images);
if( MyDebug.LOG ) {
- Log.d(TAG, "created new imageReaderRaw: " + imageReaderRaw.toString());
+ Log.d(TAG, "created new imageReaderRaw: " + imageReaderRaw);
Log.d(TAG, "imageReaderRaw surface: " + imageReaderRaw.getSurface().toString());
}
// see note above for imageReader.setOnImageAvailableListener for why we use a null handler
@@ -3961,8 +4495,12 @@ public class CameraController2 extends CameraController {
pending_burst_images.clear();
pending_burst_images_raw.clear();
pending_raw_image = null;
+ if( onImageAvailableListener != null ) {
+ onImageAvailableListener.skip_next_image = false;
+ }
if( onRawImageAvailableListener != null ) {
onRawImageAvailableListener.clear();
+ onRawImageAvailableListener.skip_next_image = false;
}
slow_burst_capture_requests = null;
n_burst = 0;
@@ -4178,37 +4716,45 @@ public class CameraController2 extends CameraController {
throw new RuntimeException(); // throw as RuntimeException, as this is a programming error
}
float zoom = zoom_ratios.get(value)/100.0f;
- Rect sensor_rect = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
- int left = sensor_rect.width()/2;
- int right = left;
- int top = sensor_rect.height()/2;
- int bottom = top;
- int hwidth = (int)(sensor_rect.width() / (2.0*zoom));
- int hheight = (int)(sensor_rect.height() / (2.0*zoom));
- left -= hwidth;
- right += hwidth;
- top -= hheight;
- bottom += hheight;
- if( MyDebug.LOG ) {
- Log.d(TAG, "zoom: " + zoom);
- Log.d(TAG, "hwidth: " + hwidth);
- Log.d(TAG, "hheight: " + hheight);
- Log.d(TAG, "sensor_rect left: " + sensor_rect.left);
- Log.d(TAG, "sensor_rect top: " + sensor_rect.top);
- Log.d(TAG, "sensor_rect right: " + sensor_rect.right);
- Log.d(TAG, "sensor_rect bottom: " + sensor_rect.bottom);
- Log.d(TAG, "left: " + left);
- Log.d(TAG, "top: " + top);
- Log.d(TAG, "right: " + right);
- Log.d(TAG, "bottom: " + bottom);
+
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.R ) {
+ camera_settings.has_control_zoom_ratio = true;
+ camera_settings.control_zoom_ratio = zoom;
+ camera_settings.setControlZoomRatio(previewBuilder);
+ }
+ else {
+ Rect sensor_rect = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
+ int left = sensor_rect.width()/2;
+ int right = left;
+ int top = sensor_rect.height()/2;
+ int bottom = top;
+ int hwidth = (int)(sensor_rect.width() / (2.0*zoom));
+ int hheight = (int)(sensor_rect.height() / (2.0*zoom));
+ left -= hwidth;
+ right += hwidth;
+ top -= hheight;
+ bottom += hheight;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "zoom: " + zoom);
+ Log.d(TAG, "hwidth: " + hwidth);
+ Log.d(TAG, "hheight: " + hheight);
+ Log.d(TAG, "sensor_rect left: " + sensor_rect.left);
+ Log.d(TAG, "sensor_rect top: " + sensor_rect.top);
+ Log.d(TAG, "sensor_rect right: " + sensor_rect.right);
+ Log.d(TAG, "sensor_rect bottom: " + sensor_rect.bottom);
+ Log.d(TAG, "left: " + left);
+ Log.d(TAG, "top: " + top);
+ Log.d(TAG, "right: " + right);
+ Log.d(TAG, "bottom: " + bottom);
/*Rect current_rect = previewBuilder.get(CaptureRequest.SCALER_CROP_REGION);
Log.d(TAG, "current_rect left: " + current_rect.left);
Log.d(TAG, "current_rect top: " + current_rect.top);
Log.d(TAG, "current_rect right: " + current_rect.right);
Log.d(TAG, "current_rect bottom: " + current_rect.bottom);*/
+ }
+ camera_settings.scalar_crop_region = new Rect(left, top, right, bottom);
+ camera_settings.setCropRegion(previewBuilder);
}
- camera_settings.scalar_crop_region = new Rect(left, top, right, bottom);
- camera_settings.setCropRegion(previewBuilder);
this.current_zoom_value = value;
try {
setRepeatingRequest();
@@ -4222,7 +4768,12 @@ public class CameraController2 extends CameraController {
e.printStackTrace();
}
}
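On Android versions before R, the branch above implements digital zoom by shrinking SCALER_CROP_REGION around the sensor centre. That arithmetic can be checked in isolation (the helper name is illustrative; the real code builds an android.graphics.Rect):

```java
import java.util.Arrays;

public class CropRegion {
    // Centered crop rectangle {left, top, right, bottom} for a digital
    // zoom factor, matching the pre-Android-11 SCALER_CROP_REGION maths:
    // half-extents shrink by 1/zoom around the sensor centre.
    static int [] cropForZoom(int sensor_width, int sensor_height, float zoom) {
        int cx = sensor_width/2;
        int cy = sensor_height/2;
        int hwidth = (int)(sensor_width / (2.0*zoom));
        int hheight = (int)(sensor_height / (2.0*zoom));
        return new int [] { cx - hwidth, cy - hheight, cx + hwidth, cy + hheight };
    }

    public static void main(String [] args) {
        System.out.println(Arrays.toString(cropForZoom(4000, 3000, 2.0f)));
    }
}
```

At zoom 1.0 the crop equals the full active array, so the Android 11+ CONTROL_ZOOM_RATIO path and this fallback agree at the endpoints; only ratios below 1.0 (ultra-wide) need the newer API.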
-
+
+ @Override
+ public void resetZoom() {
+ setZoom(zoom_value_1x);
+ }
+
@Override
public int getExposureCompensation() {
if( previewBuilder.get(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION) == null )
@@ -4325,6 +4876,7 @@ public class CameraController2 extends CameraController {
public void setFocusValue(String focus_value) {
if( MyDebug.LOG )
Log.d(TAG, "setFocusValue: " + focus_value);
+ BLOCK_FOR_EXTENSIONS();
int focus_mode;
switch(focus_value) {
case "focus_mode_auto":
@@ -4558,6 +5110,9 @@ public class CameraController2 extends CameraController {
@Override
public void setAutoExposureLock(boolean enabled) {
+ if( enabled ) {
+ BLOCK_FOR_EXTENSIONS();
+ }
camera_settings.ae_lock = enabled;
camera_settings.setAutoExposureLock(previewBuilder);
try {
@@ -4582,6 +5137,9 @@ public class CameraController2 extends CameraController {
@Override
public void setAutoWhiteBalanceLock(boolean enabled) {
+ if( enabled ) {
+ BLOCK_FOR_EXTENSIONS();
+ }
camera_settings.wb_lock = enabled;
camera_settings.setAutoWhiteBalanceLock(previewBuilder);
try {
@@ -4721,6 +5279,9 @@ public class CameraController2 extends CameraController {
@Override
public boolean setFocusAndMeteringArea(List<Area> areas) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "setFocusAndMeteringArea");
+ BLOCK_FOR_EXTENSIONS();
Rect sensor_rect = getViewableRect();
if( MyDebug.LOG )
Log.d(TAG, "sensor_rect: " + sensor_rect.left + " , " + sensor_rect.top + " x " + sensor_rect.right + " , " + sensor_rect.bottom);
@@ -4766,6 +5327,9 @@ public class CameraController2 extends CameraController {
@Override
public void clearFocusAndMetering() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "clearFocusAndMetering");
+ BLOCK_FOR_EXTENSIONS();
Rect sensor_rect = getViewableRect();
boolean has_focus = false;
boolean has_metering = false;
@@ -4805,17 +5369,24 @@ public class CameraController2 extends CameraController {
e.printStackTrace();
}
}
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "af_regions: " + Arrays.toString(camera_settings.af_regions));
+ Log.d(TAG, "ae_regions: " + Arrays.toString(camera_settings.ae_regions));
+ }
}
@Override
public List<Area> getFocusAreas() {
if( characteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AF) == 0 )
return null;
+ if( camera_settings.af_regions == null ) {
+ // needed to fix failure on Android emulator in testTakePhotoContinuousNoTouch - can happen when CONTROL_MAX_REGIONS_AF > 0, but Camera only has 1 focus mode so Preview doesn't set focus areas
+ return null;
+ }
MeteringRectangle [] metering_rectangles = previewBuilder.get(CaptureRequest.CONTROL_AF_REGIONS);
if( metering_rectangles == null )
return null;
Rect sensor_rect = getViewableRect();
- camera_settings.af_regions[0] = new MeteringRectangle(0, 0, sensor_rect.width()-1, sensor_rect.height()-1, 0);
if( metering_rectangles.length == 1 && metering_rectangles[0].getRect().left == 0 && metering_rectangles[0].getRect().top == 0 && metering_rectangles[0].getRect().right == sensor_rect.width()-1 && metering_rectangles[0].getRect().bottom == sensor_rect.height()-1 ) {
// for compatibility with CameraController1
return null;
@@ -4831,6 +5402,10 @@ public class CameraController2 extends CameraController {
public List<Area> getMeteringAreas() {
if( characteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AE) == 0 )
return null;
+ if( camera_settings.ae_regions == null ) {
+ // needed to fix failure on Android emulator in testTakePhotoContinuousNoTouch - can happen when CONTROL_MAX_REGIONS_AF > 0, but Camera only has 1 focus mode so Preview doesn't set focus areas
+ return null;
+ }
MeteringRectangle [] metering_rectangles = previewBuilder.get(CaptureRequest.CONTROL_AE_REGIONS);
if( metering_rectangles == null )
return null;
@@ -4850,6 +5425,8 @@ public class CameraController2 extends CameraController {
public boolean supportsAutoFocus() {
if( previewBuilder == null )
return false;
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION )
+ return false;
Integer focus_mode = previewBuilder.get(CaptureRequest.CONTROL_AF_MODE);
if( focus_mode == null )
return false;
@@ -4858,10 +5435,17 @@ public class CameraController2 extends CameraController {
return false;
}
+ @Override
+ public boolean supportsMetering() {
+ return characteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AE) > 0;
+ }
+
@Override
public boolean focusIsContinuous() {
if( previewBuilder == null )
return false;
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION )
+ return false;
Integer focus_mode = previewBuilder.get(CaptureRequest.CONTROL_AF_MODE);
if( focus_mode == null )
return false;
@@ -4874,6 +5458,8 @@ public class CameraController2 extends CameraController {
public boolean focusIsVideo() {
if( previewBuilder == null )
return false;
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION )
+ return false;
Integer focus_mode = previewBuilder.get(CaptureRequest.CONTROL_AF_MODE);
if( focus_mode == null )
return false;
@@ -4915,13 +5501,18 @@ public class CameraController2 extends CameraController {
if( MyDebug.LOG )
Log.d(TAG, "setRepeatingRequest");
synchronized( background_camera_lock ) {
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
return;
}
try {
- if( is_video_high_speed && Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ) {
+ extensionSession.setRepeatingRequest(request, executor, previewExtensionCaptureCallback);
+ }
+ }
+ else if( is_video_high_speed && Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
CameraConstrainedHighSpeedCaptureSession captureSessionHighSpeed = (CameraConstrainedHighSpeedCaptureSession) captureSession;
List<CaptureRequest> mPreviewBuilderBurst = captureSessionHighSpeed.createHighSpeedRequestList(request);
captureSessionHighSpeed.setRepeatingBurst(mPreviewBuilderBurst, previewCaptureCallback, handler);
@@ -4945,15 +5536,19 @@ public class CameraController2 extends CameraController {
capture(previewBuilder.build());
}
+ /** Performs a "capture" - note that in practice this isn't used for taking photos, but for
+ * one-off captures for the preview stream (e.g., to trigger focus).
+ */
private void capture(CaptureRequest request) throws CameraAccessException {
if( MyDebug.LOG )
Log.d(TAG, "capture");
synchronized( background_camera_lock ) {
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
return;
}
+ BLOCK_FOR_EXTENSIONS(); // not yet supported for extension sessions
captureSession.capture(request, previewCaptureCallback, handler);
}
}
@@ -4970,14 +5565,13 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "camera: " + camera);
try {
previewBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
- previewIsVideoMode = false;
previewBuilder.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_PREVIEW);
+ previewIsVideoMode = false;
camera_settings.setupBuilder(previewBuilder, false);
if( MyDebug.LOG )
Log.d(TAG, "successfully created preview request");
}
catch(CameraAccessException e) {
- //captureSession = null;
if( MyDebug.LOG ) {
Log.e(TAG, "failed to create capture request");
Log.e(TAG, "reason: " + e.getReason());
@@ -5026,13 +5620,42 @@ public class CameraController2 extends CameraController {
return;
}
- synchronized( background_camera_lock ) {
- if( captureSession != null ) {
- if( MyDebug.LOG )
- Log.d(TAG, "close old capture session");
- captureSession.close();
- captureSession = null;
- //pending_request_when_ready = null;
+ closeCaptureSession();
+
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // check parameters are compatible with extension sessions
+ // we check here rather than when setting those parameters, to avoid problems with
+ // ordering (e.g., the caller sets those parameters, and then switches to an
+ // extension session)
+ if( want_video_high_speed ) {
+ throw new RuntimeException("want_video_high_speed not supported for extension session");
+ }
+ else if( burst_type != BurstType.BURSTTYPE_NONE ) {
+ throw new RuntimeException("burst_type not supported for extension session");
+ }
+ else if( want_raw ) {
+ throw new RuntimeException("want_raw not supported for extension session");
+ }
+ else if( camera_settings.has_iso ) {
+ throw new RuntimeException("has_iso not supported for extension session");
+ }
+ else if( camera_settings.ae_target_fps_range != null ) {
+ throw new RuntimeException("ae_target_fps_range not supported for extension session");
+ }
+ else if( camera_settings.sensor_frame_duration > 0 ) {
+ throw new RuntimeException("sensor_frame_duration not supported for extension session");
+ }
+ else if( camera_settings.ae_lock ) {
+ throw new RuntimeException("ae_lock not supported for extension session");
+ }
+ else if( camera_settings.wb_lock ) {
+ throw new RuntimeException("wb_lock not supported for extension session");
+ }
+ else if( camera_settings.has_face_detect_mode ) {
+ throw new RuntimeException("has_face_detect_mode not supported for extension session");
+ }
+ else if( face_detection_listener != null ) {
+ throw new RuntimeException("face_detection_listener not supported for extension session");
}
}
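The chain of checks above fails fast on the first setting that extension sessions can't honour. As a standalone sketch (the class name, method name, and reduced flag set here are illustrative, not the real fields), the same pattern can be written as a pure function, which keeps the check order easy to test:

```java
public final class ExtensionCompatibility {
    /** Returns the name of the first requested feature that extension sessions
     *  can't support, or null if the configuration is compatible. */
    public static String firstUnsupportedFeature(boolean wantVideoHighSpeed,
                                                 boolean wantBurst,
                                                 boolean wantRaw,
                                                 boolean hasManualIso) {
        if( wantVideoHighSpeed )
            return "want_video_high_speed";
        if( wantBurst )
            return "burst_type";
        if( wantRaw )
            return "want_raw";
        if( hasManualIso )
            return "has_iso";
        return null; // compatible with an extension session
    }
}
```

A caller could then throw a single RuntimeException naming the offending feature, rather than maintaining a long if/else chain inline.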
@@ -5094,13 +5717,9 @@ public class CameraController2 extends CameraController {
}
class MyStateCallback extends CameraCaptureSession.StateCallback {
- private boolean callback_done; // must sychronize on this and notifyAll when setting to true
- @Override
- public void onConfigured(@NonNull CameraCaptureSession session) {
- if( MyDebug.LOG ) {
- Log.d(TAG, "onConfigured: " + session);
- Log.d(TAG, "captureSession was: " + captureSession);
- }
+ private boolean callback_done; // must synchronize on this and notifyAll when setting to true
+
+ void onConfigured(@NonNull CameraCaptureSession session, @NonNull CameraExtensionSession eSession) {
if( camera == null ) {
if( MyDebug.LOG ) {
Log.d(TAG, "camera is closed");
@@ -5113,11 +5732,8 @@ public class CameraController2 extends CameraController {
}
synchronized( background_camera_lock ) {
captureSession = session;
- Surface surface = getPreviewSurface();
- if( MyDebug.LOG ) {
- Log.d(TAG, "add surface to previewBuilder: " + surface);
- }
- previewBuilder.addTarget(surface);
+ extensionSession = eSession;
+ previewBuilder.addTarget(surface_texture);
if( video_recorder != null ) {
if( MyDebug.LOG ) {
Log.d(TAG, "add video recorder surface to previewBuilder: " + video_recorder_surface);
@@ -5137,6 +5753,7 @@ public class CameraController2 extends CameraController {
// we indicate that we failed to start the preview by setting captureSession back to null
// this will cause a CameraControllerException to be thrown below
captureSession = null;
+ extensionSession = null;
}
}
synchronized( background_camera_lock ) {
@@ -5146,11 +5763,14 @@ public class CameraController2 extends CameraController {
}
@Override
- public void onConfigureFailed(@NonNull CameraCaptureSession session) {
+ public void onConfigured(@NonNull CameraCaptureSession session) {
if( MyDebug.LOG ) {
- Log.d(TAG, "onConfigureFailed: " + session);
- Log.d(TAG, "captureSession was: " + captureSession);
+ Log.d(TAG, "onConfigured: " + session);
}
+ onConfigured(session, null);
+ }
+
+ void onConfigureFailed() {
synchronized( background_camera_lock ) {
callback_done = true;
background_camera_lock.notifyAll();
@@ -5158,6 +5778,14 @@ public class CameraController2 extends CameraController {
// don't throw CameraControllerException here, as won't be caught - instead we throw CameraControllerException below
}
+ @Override
+ public void onConfigureFailed(@NonNull CameraCaptureSession session) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "onConfigureFailed: " + session);
+ }
+ onConfigureFailed();
+ }
+
/*@Override
public void onReady(CameraCaptureSession session) {
if( MyDebug.LOG )
@@ -5216,6 +5844,7 @@ public class CameraController2 extends CameraController {
if( MyDebug.LOG ) {
Log.d(TAG, "texture: " + texture);
Log.d(TAG, "preview_surface: " + preview_surface);
+ Log.d(TAG, "handler: " + handler);
}
}
if( MyDebug.LOG ) {
@@ -5234,7 +5863,49 @@ public class CameraController2 extends CameraController {
}
}
}
- if( video_recorder != null && want_video_high_speed && Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ) {
+ //int extension = CameraExtensionCharacteristics.EXTENSION_AUTOMATIC;
+ //int extension = CameraExtensionCharacteristics.EXTENSION_BOKEH;
+ int extension = camera_extension;
+ List<OutputConfiguration> outputs = new ArrayList<>();
+ for(Surface surface : surfaces) {
+ outputs.add(new OutputConfiguration(surface));
+ }
+ ExtensionSessionConfiguration extensionConfiguration = new ExtensionSessionConfiguration(
+ extension,
+ outputs,
+ executor,
+ new CameraExtensionSession.StateCallback() {
+ @Override
+ public void onConfigured(@NonNull CameraExtensionSession session) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "onConfigured: " + session);
+ }
+ myStateCallback.onConfigured(null, session);
+ }
+
+ @Override
+ public void onConfigureFailed(@NonNull CameraExtensionSession session) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "onConfigureFailed: " + session);
+ }
+ myStateCallback.onConfigureFailed();
+ }
+
+ @Override
+ public void onClosed(@NonNull CameraExtensionSession session) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "onClosed: " + session);
+ }
+ }
+ }
+ );
+ camera.createExtensionSession(extensionConfiguration);
+ }
+ is_video_high_speed = false;
+ }
+ else if( video_recorder != null && want_video_high_speed && Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
//if( want_video_high_speed && Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
camera.createConstrainedHighSpeedCaptureSession(surfaces,
myStateCallback,
@@ -5278,10 +5949,16 @@ public class CameraController2 extends CameraController {
}
}
if( MyDebug.LOG ) {
- Log.d(TAG, "created captureSession: " + captureSession);
+ if( captureSession != null )
+ Log.d(TAG, "created captureSession: " + captureSession);
+ if( extensionSession != null )
+ Log.d(TAG, "created extensionSession: " + extensionSession);
+ }
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ resetCaptureResultInfo(); // important as extension modes don't receive capture result info
}
synchronized( background_camera_lock ) {
- if( captureSession == null ) {
+ if( !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.e(TAG, "failed to create capture session");
throw new CameraControllerException();
@@ -5313,8 +5990,22 @@ public class CameraController2 extends CameraController {
public void startPreview() throws CameraControllerException {
if( MyDebug.LOG )
Log.d(TAG, "startPreview");
+
+ if( !camera_settings.has_af_mode && initial_focus_mode != null && sessionType != SessionType.SESSIONTYPE_EXTENSION ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "user didn't specify focus, so set to: " + initial_focus_mode);
+ // If the caller hasn't set a focus mode, but focus modes are supported, it's still better to explicitly set one rather than leaving it to the
+ // builder's default - e.g., problem on Android emulator with LIMITED camera where it only supported infinity focus (CONTROL_AF_MODE_OFF), but
+ // the preview builder defaults to CONTROL_AF_MODE_CONTINUOUS_PICTURE! This meant we froze when trying to take a photo, because we thought
+ // we were in continuous picture mode and so waited in state STATE_WAITING_AUTOFOCUS, but the focus never occurred.
+ // Ideally the caller to CameraController2 (Preview) should always explicitly set a focus mode if at least 1 focus mode is supported. At the
+ // time of writing, Preview only sets a focus mode if at least 2 focus modes are supported. But even if we fix that in future, still good to have
+ // well defined behaviour at the CameraController level.
+ setFocusValue(initial_focus_mode);
+ }
+
synchronized( background_camera_lock ) {
- if( captureSession != null ) {
+ if( hasCaptureSession() ) {
try {
setRepeatingRequest();
}
@@ -5339,7 +6030,7 @@ public class CameraController2 extends CameraController {
if( MyDebug.LOG )
Log.d(TAG, "stopPreview: " + this);
synchronized( background_camera_lock ) {
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
return;
@@ -5348,7 +6039,14 @@ public class CameraController2 extends CameraController {
//pending_request_when_ready = null;
try {
- captureSession.stopRepeating();
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ) {
+ extensionSession.stopRepeating();
+ }
+ }
+ else {
+ captureSession.stopRepeating();
+ }
}
catch(IllegalStateException e) {
if( MyDebug.LOG )
@@ -5358,10 +6056,7 @@ public class CameraController2 extends CameraController {
// we still call close() below, as it has no effect if captureSession is already closed
}
// although stopRepeating() alone will pause the preview, seems better to close captureSession altogether - this allows the app to make changes such as changing the picture size
- if( MyDebug.LOG )
- Log.d(TAG, "close capture session");
- captureSession.close();
- captureSession = null;
+ closeCaptureSession();
}
catch(CameraAccessException e) {
if( MyDebug.LOG ) {
@@ -5386,6 +6081,7 @@ public class CameraController2 extends CameraController {
public boolean startFaceDetection() {
if( MyDebug.LOG )
Log.d(TAG, "startFaceDetection");
+ BLOCK_FOR_EXTENSIONS();
if( previewBuilder.get(CaptureRequest.STATISTICS_FACE_DETECT_MODE) != null && previewBuilder.get(CaptureRequest.STATISTICS_FACE_DETECT_MODE) != CaptureRequest.STATISTICS_FACE_DETECT_MODE_OFF ) {
if( MyDebug.LOG )
Log.d(TAG, "face detection already enabled");
@@ -5426,6 +6122,9 @@ public class CameraController2 extends CameraController {
@Override
public void setFaceDetectionListener(final FaceDetectionListener listener) {
+ if( listener != null ) {
+ BLOCK_FOR_EXTENSIONS();
+ }
this.face_detection_listener = listener;
this.last_faces_detected = -1;
}
@@ -5435,17 +6134,19 @@ public class CameraController2 extends CameraController {
If do_af_trigger_for_continuous is true, we set CONTROL_AF_TRIGGER_START, and wait for
CONTROL_AF_STATE_FOCUSED_LOCKED or CONTROL_AF_STATE_NOT_FOCUSED_LOCKED, similar to other focus
methods.
- do_af_trigger_for_continuous==true has advantages:
+ do_af_trigger_for_continuous==true used to have advantages:
- On Nexus 6 for flash auto, it means ae state is set to FLASH_REQUIRED if it is required
when it comes to taking the photo. If do_af_trigger_for_continuous==false, sometimes
it's set to CONTROL_AE_STATE_CONVERGED even for dark scenes, so we think we can skip
the precapture, causing photos to come out dark (or we can force always doing precapture,
but that makes things slower when flash isn't needed)
+ Update: this now seems hard to reproduce.
- On OnePlus 3T, with do_af_trigger_for_continuous==false photos come out with blue tinge
if the scene is not dark (but still dark enough that you'd want flash).
do_af_trigger_for_continuous==true fixes this for cases where the flash fires for autofocus.
Note that the problem is still not fixed for flash on where the scene is bright enough to
not need flash (and so we don't fire flash for autofocus).
+ Update: now fixed by setting TEMPLATE_PREVIEW for the precaptureBuilder.
do_af_trigger_for_continuous==true has disadvantage:
- On both Nexus 6 and OnePlus 3T, taking photos with flash is longer, as we have flash firing
for autofocus and precapture. Though note this is the case with autofocus mode anyway.
@@ -5453,8 +6154,7 @@ public class CameraController2 extends CameraController {
af trigger for fake flash mode can sometimes mean flash fires for too long and we get a worse
result).
*/
- //private final static boolean do_af_trigger_for_continuous = false;
- private final static boolean do_af_trigger_for_continuous = true;
+ private final static boolean do_af_trigger_for_continuous = false;
@Override
public void autoFocus(final AutoFocusCallback cb, boolean capture_follows_autofocus_hint) {
@@ -5465,7 +6165,7 @@ public class CameraController2 extends CameraController {
AutoFocusCallback push_autofocus_cb = null;
synchronized( background_camera_lock ) {
fake_precapture_torch_focus_performed = false;
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
// should call the callback, so the application isn't left waiting (e.g., when we autofocus before trying to take a photo)
@@ -5473,6 +6173,8 @@ public class CameraController2 extends CameraController {
return;
}
Integer focus_mode = previewBuilder.get(CaptureRequest.CONTROL_AF_MODE);
+ if( MyDebug.LOG )
+ Log.d(TAG, "focus mode: " + (focus_mode == null ? "null" : focus_mode));
if( focus_mode == null ) {
// we preserve the old Camera API where calling autoFocus() on a device without autofocus immediately calls the callback
// (unclear if Open Camera needs this, but just to be safe and consistent between camera APIs)
@@ -5481,6 +6183,12 @@ public class CameraController2 extends CameraController {
cb.onAutoFocus(true);
return;
}
+ else if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "no auto focus for extensions");
+ cb.onAutoFocus(true);
+ return;
+ }
else if( (!do_af_trigger_for_continuous || use_fake_precapture_mode) && focus_mode == CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE ) {
// See note above for do_af_trigger_for_continuous
if( MyDebug.LOG )
@@ -5599,6 +6307,7 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "setCaptureFollowAutofocusHint");
Log.d(TAG, "capture_follows_autofocus_hint? " + capture_follows_autofocus_hint);
}
+ BLOCK_FOR_EXTENSIONS();
synchronized( background_camera_lock ) {
this.capture_follows_autofocus_hint = capture_follows_autofocus_hint;
}
@@ -5609,7 +6318,7 @@ public class CameraController2 extends CameraController {
if( MyDebug.LOG )
Log.d(TAG, "cancelAutoFocus");
synchronized( background_camera_lock ) {
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
return;
@@ -5621,6 +6330,12 @@ public class CameraController2 extends CameraController {
return;
}
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "session type extension");
+ return;
+ }
+
previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
// Camera2Basic does a capture then sets a repeating request - do the same here just to be safe
try {
@@ -5657,6 +6372,9 @@ public class CameraController2 extends CameraController {
public void setContinuousFocusMoveCallback(ContinuousFocusMoveCallback cb) {
if( MyDebug.LOG )
Log.d(TAG, "setContinuousFocusMoveCallback");
+ if( cb != null ) {
+ BLOCK_FOR_EXTENSIONS();
+ }
this.continuous_focus_move_callback = cb;
}
@@ -5679,14 +6397,11 @@ public class CameraController2 extends CameraController {
/** Sets up a builder to have manual exposure time, if supported. The exposure time will be
* clamped to the allowed values, and manual ISO will also be set based on the current ISO value.
*/
- private void setManualExposureTime(CaptureRequest.Builder stillBuilder, long exposure_time) {
+ private void setManualExposureTime(CaptureRequest.Builder stillBuilder, long exposure_time, boolean set_iso, int new_iso) {
if( MyDebug.LOG )
Log.d(TAG, "setManualExposureTime: " + exposure_time);
- Range<Long> exposure_time_range = characteristics.get(CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE); // may be null on some devices
Range<Integer> iso_range = characteristics.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE); // may be null on some devices
- if( exposure_time_range != null && iso_range != null ) {
- long min_exposure_time = exposure_time_range.getLower();
- long max_exposure_time = exposure_time_range.getUpper();
+ if( this.supports_exposure_time && iso_range != null ) {
if( exposure_time < min_exposure_time )
exposure_time = min_exposure_time;
if( exposure_time > max_exposure_time )
@@ -5698,12 +6413,17 @@ public class CameraController2 extends CameraController {
{
// set ISO
int iso = 800;
- if( capture_result_has_iso )
+ if( set_iso )
+ iso = new_iso;
+ else if( capture_result_has_iso )
iso = capture_result_iso;
// see https://sourceforge.net/p/opencamera/tickets/321/ - some devices may have auto ISO that's
// outside of the allowed manual iso range!
iso = Math.max(iso, iso_range.getLower());
iso = Math.min(iso, iso_range.getUpper());
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "iso: " + iso);
+ }
stillBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, iso );
}
if( capture_result_has_frame_duration )
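The work done by setManualExposureTime() reduces to two range clamps: exposure time into the sensor's supported range, and ISO into the sensitivity range. A minimal standalone sketch (on a real device the bounds come from SENSOR_INFO_EXPOSURE_TIME_RANGE and SENSOR_INFO_SENSITIVITY_RANGE; the names here are illustrative):

```java
public final class ManualExposureClamp {
    // Clamp a manual exposure time (in ns) into the sensor's supported range.
    public static long clampExposureTime(long exposureTimeNs, long minNs, long maxNs) {
        return Math.max(minNs, Math.min(maxNs, exposureTimeNs));
    }

    // Clamp ISO into the sensitivity range; some devices report an auto ISO
    // outside the allowed manual range, so both ends must be enforced.
    public static int clampIso(int iso, int minIso, int maxIso) {
        return Math.max(minIso, Math.min(maxIso, iso));
    }
}
```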
@@ -5745,7 +6465,7 @@ public class CameraController2 extends CameraController {
ErrorCallback push_take_picture_error_cb = null;
synchronized( background_camera_lock ) {
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
return;
@@ -5753,7 +6473,7 @@ public class CameraController2 extends CameraController {
try {
if( MyDebug.LOG ) {
if( imageReaderRaw != null ) {
- Log.d(TAG, "imageReaderRaw: " + imageReaderRaw.toString());
+ Log.d(TAG, "imageReaderRaw: " + imageReaderRaw);
Log.d(TAG, "imageReaderRaw surface: " + imageReaderRaw.getSurface().toString());
}
else {
@@ -5761,8 +6481,9 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "imageReader surface: " + imageReader.getSurface().toString());
}
}
- stillBuilder = camera.createCaptureRequest(previewIsVideoMode ? CameraDevice.TEMPLATE_VIDEO_SNAPSHOT : CameraDevice.TEMPLATE_STILL_CAPTURE);
- stillBuilder.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_STILL_CAPTURE);
+ // important to use TEMPLATE_MANUAL for manual exposure: this fixes bug on Pixel 6 Pro where manual exposure is ignored when longer than the
+ // preview exposure time (oddly Galaxy S10e has the same bug since Android 11, but that isn't fixed with using TEMPLATE_MANUAL)
+ stillBuilder = camera.createCaptureRequest(previewIsVideoMode ? CameraDevice.TEMPLATE_VIDEO_SNAPSHOT : camera_settings.has_iso ? CameraDevice.TEMPLATE_MANUAL : CameraDevice.TEMPLATE_STILL_CAPTURE);
stillBuilder.setTag(new RequestTagObject(RequestTagType.CAPTURE));
camera_settings.setupBuilder(stillBuilder, true);
if( use_fake_precapture_mode && fake_precapture_torch_performed ) {
@@ -5786,14 +6507,15 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "exposure_time_scale: " + exposure_time_scale);
}
modified_from_camera_settings = true;
- setManualExposureTime(stillBuilder, exposure_time);
+ setManualExposureTime(stillBuilder, exposure_time, false, 0);
}
}
//stillBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
//stillBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
- if( Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.O ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.O && sessionType != SessionType.SESSIONTYPE_EXTENSION ) {
// unclear why we wouldn't want to request ZSL
// this is also required to enable HDR+ on Google Pixel devices when using Camera2: https://opensource.google.com/projects/pixelvisualcorecamera
+ // but don't set for extension sessions (in theory it should be ignored, but just in case)
stillBuilder.set(CaptureRequest.CONTROL_ENABLE_ZSL, true);
if( MyDebug.LOG ) {
Boolean zsl = stillBuilder.get(CaptureRequest.CONTROL_ENABLE_ZSL);
@@ -5815,7 +6537,8 @@ public class CameraController2 extends CameraController {
// need to stop preview before capture (as done in Camera2Basic; otherwise we get bugs such as flash remaining on after taking a photo with flash)
// but don't do this in video mode - if we're taking photo snapshots while video recording, we don't want to pause video!
// update: bug with flash may have been device specific (things are fine with Nokia 8)
- captureSession.stopRepeating();
+ if( sessionType != SessionType.SESSIONTYPE_EXTENSION )
+ captureSession.stopRepeating();
}
}
catch(CameraAccessException e) {
@@ -5853,7 +6576,7 @@ public class CameraController2 extends CameraController {
if( ok ) {
synchronized( background_camera_lock ) {
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
return;
@@ -5879,9 +6602,17 @@ public class CameraController2 extends CameraController {
if( MyDebug.LOG )
Log.d(TAG, "capture with stillBuilder");
//pending_request_when_ready = stillBuilder.build();
- captureSession.capture(stillBuilder.build(), previewCaptureCallback, handler);
- //captureSession.capture(stillBuilder.build(), new CameraCaptureSession.CaptureCallback() {
- //}, handler);
+
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ) {
+ extensionSession.capture(stillBuilder.build(), executor, previewExtensionCaptureCallback);
+ }
+ }
+ else {
+ captureSession.capture(stillBuilder.build(), previewCaptureCallback, handler);
+ //captureSession.capture(stillBuilder.build(), new CameraCaptureSession.CaptureCallback() {
+ //}, handler);
+ }
playSound(MediaActionSound.SHUTTER_CLICK); // play shutter sound asap, otherwise user has the illusion of being slow to take photos
}
catch(CameraAccessException e) {
@@ -5986,13 +6717,14 @@ public class CameraController2 extends CameraController {
if( burst_type != BurstType.BURSTTYPE_EXPO && burst_type != BurstType.BURSTTYPE_FOCUS ) {
Log.e(TAG, "takePictureBurstBracketing called but unexpected burst_type: " + burst_type);
}
+ BLOCK_FOR_EXTENSIONS(); // not supported for extension sessions
List<CaptureRequest> requests = new ArrayList<>();
boolean ok = true;
ErrorCallback push_take_picture_error_cb = null;
synchronized( background_camera_lock ) {
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
return;
@@ -6002,11 +6734,19 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "imageReader: " + imageReader.toString());
Log.d(TAG, "imageReader surface: " + imageReader.getSurface().toString());
}
+ int n_dummy_requests = 0;
- CaptureRequest.Builder stillBuilder = camera.createCaptureRequest(previewIsVideoMode ? CameraDevice.TEMPLATE_VIDEO_SNAPSHOT : CameraDevice.TEMPLATE_STILL_CAPTURE);
- stillBuilder.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_STILL_CAPTURE);
+ CaptureRequest.Builder stillBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_MANUAL);
+ // Needs to be TEMPLATE_MANUAL! Otherwise first image in burst may come out incorrectly (on Pixel 6 Pro,
+ // the first image incorrectly had HDR+ applied, which we don't want here).
// n.b., don't set RequestTagType.CAPTURE here - we only do it for the last of the burst captures (see below)
camera_settings.setupBuilder(stillBuilder, true);
+
+ if( MyDebug.LOG && Build.VERSION.SDK_INT >= Build.VERSION_CODES.O ) {
+ Boolean zsl = stillBuilder.get(CaptureRequest.CONTROL_ENABLE_ZSL);
+ Log.d(TAG, "CONTROL_ENABLE_ZSL: " + (zsl==null ? "null" : zsl));
+ }
+
clearPending();
// shouldn't add preview surface as a target - see note in takePictureAfterPrecapture()
// but also, adding the preview surface causes the dark/light exposures to be visible, which we don't want
@@ -6070,14 +6810,7 @@ public class CameraController2 extends CameraController {
base_exposure_time = capture_result_exposure_time;
int n_half_images = expo_bracketing_n_images/2;
- long min_exposure_time = base_exposure_time;
- long max_exposure_time = base_exposure_time;
final double scale = Math.pow(2.0, expo_bracketing_stops/(double)n_half_images);
- Range<Long> exposure_time_range = characteristics.get(CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE); // may be null on some devices
- if( exposure_time_range != null ) {
- min_exposure_time = exposure_time_range.getLower();
- max_exposure_time = exposure_time_range.getUpper();
- }
if( MyDebug.LOG ) {
Log.d(TAG, "taking expo bracketing with n_images: " + expo_bracketing_n_images);
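The bracketing scale above comes from splitting expo_bracketing_stops evenly across the n_half_images frames on each side of the base exposure: each step away from the base multiplies (or divides) the exposure time by pow(2, stops/n_half). A worked sketch of that arithmetic (class and method names are illustrative):

```java
public final class ExpoBracketScale {
    // A frame that is stepsFromBase steps away from the base exposure is scaled
    // by pow(2, totalStops / nHalfImages) ^ stepsFromBase, so the outermost
    // frame differs from the base by exactly totalStops stops.
    public static double frameScale(double totalStops, int nHalfImages, int stepsFromBase) {
        double perStepScale = Math.pow(2.0, totalStops / nHalfImages);
        return Math.pow(perStepScale, stepsFromBase);
    }
}
```

For example, with 3 images at 2 stops (n_half_images == 1), the dark frame uses base_exposure_time / 4 and the bright frame base_exposure_time * 4.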
@@ -6088,10 +6821,25 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "Max exposure time: " + max_exposure_time);
}
+ if( dummy_capture_hack && use_expo_fast_burst ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "add dummy capture");
+ // dummy_capture_hack only supported for use_expo_fast_burst==true -
+ // supporting for use_expo_fast_burst==false would complicate the code, and
+ // these are only special case hacks anyway
+ stillBuilder.setTag(null);
+ requests.add( stillBuilder.build() );
+ n_dummy_requests++;
+ if( onImageAvailableListener != null )
+ onImageAvailableListener.skip_next_image = true;
+ if( onRawImageAvailableListener != null )
+ onRawImageAvailableListener.skip_next_image = true;
+ }
+
// darker images
for(int i=0;i<n_half_images;i++) {
- if( capture_result_has_iso && capture_result_iso >= ISO_FOR_DARK ) {
+ if( capture_result_has_iso && capture_result_has_exposure_time ) {
+ if( HDRProcessor.sceneIsLowLight(capture_result_iso, capture_result_exposure_time) ) {
if( MyDebug.LOG )
Log.d(TAG, "optimise for dark scene");
n_burst = noise_reduction_low_light ? N_IMAGES_NR_DARK_LOW_LIGHT : N_IMAGES_NR_DARK;
- boolean is_oneplus = Build.MANUFACTURER.toLowerCase(Locale.US).contains("oneplus");
// OnePlus 3T at least has bug where manual ISO can't be set to above 800, so dark images end up too dark -
// so no point enabling this code, which is meant to brighten the scene, not make it darker!
if( !camera_settings.has_iso && !is_oneplus ) {
long exposure_time = noise_reduction_low_light ? 1000000000L/3 : 1000000000L/10;
- if( !capture_result_has_exposure_time || capture_result_exposure_time < exposure_time ) {
+ if( capture_result_exposure_time < exposure_time ) {
if( MyDebug.LOG )
Log.d(TAG, "also set long exposure time");
modified_from_camera_settings = true;
- setManualExposureTime(stillBuilder, exposure_time);
+
+ boolean set_new_iso = false;
+ int new_iso = 0;
+ {
+ // ISO*exposure should be a constant
+ set_new_iso = true;
+ new_iso = (int)((capture_result_iso * capture_result_exposure_time)/exposure_time);
+ // but don't make ISO too low
+ new_iso = Math.max(new_iso, capture_result_iso/2);
+ new_iso = Math.max(new_iso, 1100);
+ if( MyDebug.LOG )
+ Log.d(TAG, "... and set iso to " + new_iso);
+ }
+
+ setManualExposureTime(stillBuilder, exposure_time, set_new_iso, new_iso);
}
else {
if( MyDebug.LOG )
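The ISO adjustment above keeps ISO × exposure time approximately constant when the exposure time is lengthened, with floors so the result doesn't drop too far. A standalone sketch of that arithmetic (the absolute floor of 1100 used in the diff is passed as a parameter here; names are illustrative):

```java
public final class IsoReciprocity {
    // Scale ISO down in proportion to the increase in exposure time, so that
    // ISO * exposure time stays roughly constant, but never drop below half
    // the current ISO nor below an absolute floor.
    public static int scaledIso(int currentIso, long currentExposureNs, long newExposureNs,
                                int absoluteFloor) {
        int newIso = (int)((currentIso * currentExposureNs) / newExposureNs);
        newIso = Math.max(newIso, currentIso / 2);
        newIso = Math.max(newIso, absoluteFloor);
        return newIso;
    }
}
```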
@@ -6448,7 +7209,9 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "optimise for bright scene");
//n_burst = 2;
n_burst = 3;
- if( !camera_settings.has_iso ) {
+ // At least some Samsung devices (e.g., Galaxy S10e on Android 11) give better results in auto mode
+ // than manual mode, so we're better off staying in auto mode.
+ if( !camera_settings.has_iso && !is_samsung ) {
double exposure_time_scale = getScaleForExposureTime(exposure_time, fixed_exposure_time, scaled_exposure_time, full_exposure_time_scale);
exposure_time *= exposure_time_scale;
if( MyDebug.LOG ) {
@@ -6456,7 +7219,7 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "exposure_time_scale: " + exposure_time_scale);
}
modified_from_camera_settings = true;
- setManualExposureTime(stillBuilder, exposure_time);
+ setManualExposureTime(stillBuilder, exposure_time, false, 0);
}
}
}
@@ -6508,7 +7271,7 @@ public class CameraController2 extends CameraController {
if( ok ) {
synchronized( background_camera_lock ) {
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
return;
@@ -6593,7 +7356,7 @@ public class CameraController2 extends CameraController {
ErrorCallback push_take_picture_error_cb = null;
synchronized( background_camera_lock ) {
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
return;
@@ -6659,6 +7422,9 @@ public class CameraController2 extends CameraController {
private void runPrecapture() {
if( MyDebug.LOG )
Log.d(TAG, "runPrecapture");
+
+ BLOCK_FOR_EXTENSIONS(); // not supported for extension sessions
+
long debug_time = 0;
if( MyDebug.LOG ) {
debug_time = System.currentTimeMillis();
@@ -6675,9 +7441,12 @@ public class CameraController2 extends CameraController {
Log.e(TAG, "shouldn't be doing precapture for burst - should be using fake precapture!");
}
try {
- // use a separate builder for precapture - otherwise have problem that if we take photo with flash auto/on of dark scene, then point to a bright scene, the autoexposure isn't running until we autofocus again
- final CaptureRequest.Builder precaptureBuilder = camera.createCaptureRequest(previewIsVideoMode ? CameraDevice.TEMPLATE_VIDEO_SNAPSHOT : CameraDevice.TEMPLATE_STILL_CAPTURE);
- precaptureBuilder.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_STILL_CAPTURE);
+ // Use a separate builder for precapture - otherwise have problem that if we take photo with flash auto/on of dark scene, then point to a bright scene, the autoexposure isn't running until we autofocus again.
+ // Important that this is TEMPLATE_PREVIEW not TEMPLATE_STILL_CAPTURE, otherwise we have various problems with flash:
+ // * Flash won't fire on Galaxy devices.
+ // * End up with blue tinge on OnePlus 3T.
+ // * Flash auto produces blue tinge and leaves torch on for Pixel 6 Pro.
+ final CaptureRequest.Builder precaptureBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
camera_settings.setupBuilder(precaptureBuilder, false);
precaptureBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
@@ -6724,6 +7493,9 @@ public class CameraController2 extends CameraController {
private void runFakePrecapture() {
if( MyDebug.LOG )
Log.d(TAG, "runFakePrecapture");
+
+ BLOCK_FOR_EXTENSIONS(); // not supported for extension sessions
+
long debug_time = 0;
if( MyDebug.LOG ) {
debug_time = System.currentTimeMillis();
@@ -6877,7 +7649,7 @@ public class CameraController2 extends CameraController {
boolean call_runPrecapture = false;
synchronized( background_camera_lock ) {
- if( camera == null || captureSession == null ) {
+ if( camera == null || !hasCaptureSession() ) {
if( MyDebug.LOG )
Log.d(TAG, "no camera or capture session");
error.onError();
@@ -6889,7 +7661,7 @@ public class CameraController2 extends CameraController {
this.done_all_captures = false;
this.take_picture_error_cb = error;
this.fake_precapture_torch_performed = false; // just in case still on?
- if( !ready_for_capture ) {
+ if( sessionType == SessionType.SESSIONTYPE_NORMAL && !ready_for_capture ) {
if( MyDebug.LOG )
Log.e(TAG, "takePicture: not ready for capture!");
//throw new RuntimeException(); // debugging
@@ -6900,8 +7672,12 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "current flash value: " + camera_settings.flash_value);
Log.d(TAG, "use_fake_precapture_mode: " + use_fake_precapture_mode);
}
- // Don't need precapture if flash off or torch
- if( camera_settings.flash_value.equals("flash_off") || camera_settings.flash_value.equals("flash_torch") || camera_settings.flash_value.equals("flash_frontscreen_torch") ) {
+ if( sessionType == SessionType.SESSIONTYPE_EXTENSION ) {
+ // precapture not supported for extensions
+ call_takePictureAfterPrecapture = true;
+ }
+ else if( camera_settings.flash_value.equals("flash_off") || camera_settings.flash_value.equals("flash_torch") || camera_settings.flash_value.equals("flash_frontscreen_torch") ) {
+ // Don't need precapture if flash off or torch
call_takePictureAfterPrecapture = true;
}
else if( use_fake_precapture_mode ) {
@@ -7016,6 +7792,7 @@ public class CameraController2 extends CameraController {
@Override
public void initVideoRecorderPrePrepare(MediaRecorder video_recorder) {
// if we change where we play the START_VIDEO_RECORDING sound, make sure it can't be heard in resultant video
+ BLOCK_FOR_EXTENSIONS(); // not supported for extension sessions
playSound(MediaActionSound.START_VIDEO_RECORDING);
}
@@ -7027,6 +7804,7 @@ public class CameraController2 extends CameraController {
Log.e(TAG, "no camera");
throw new CameraControllerException();
}
+ BLOCK_FOR_EXTENSIONS(); // not supported for extension sessions
try {
if( MyDebug.LOG )
Log.d(TAG, "obtain video_recorder surface");
@@ -7095,7 +7873,7 @@ public class CameraController2 extends CameraController {
@Override
public int captureResultWhiteBalanceTemperature() {
// for performance reasons, we don't convert from rggb to temperature in every frame, rather only when requested
- return convertRggbToTemperature(capture_result_white_balance_rggb);
+ return convertRggbVectorToTemperature(capture_result_white_balance_rggb);
}
@Override
@@ -7155,7 +7933,85 @@ public class CameraController2 extends CameraController {
}
*/
- private final CameraCaptureSession.CaptureCallback previewCaptureCallback = new CameraCaptureSession.CaptureCallback() {
+ private final CameraExtensionSession.ExtensionCaptureCallback previewExtensionCaptureCallback;
+
+ @RequiresApi(api = Build.VERSION_CODES.S)
+ private class MyExtensionCaptureCallback extends CameraExtensionSession.ExtensionCaptureCallback {
+
+ @Override
+ public void onCaptureStarted(@NonNull CameraExtensionSession session,
+ @NonNull CaptureRequest request, long timestamp) {
+ /*if( MyDebug.LOG )
+ Log.d(TAG, "onCaptureStarted");*/
+ if( MyDebug.LOG ) {
+ if( previewCaptureCallback.getRequestTagType(request) == RequestTagType.CAPTURE ) {
+ Log.d(TAG, "onCaptureStarted: capture");
+ }
+ else if( previewCaptureCallback.getRequestTagType(request) == RequestTagType.CAPTURE_BURST_IN_PROGRESS ) {
+ Log.d(TAG, "onCaptureStarted: capture burst in progress");
+ }
+ }
+
+ // for previewCaptureCallback, we set has_received_frame in onCaptureCompleted(), but
+ // that method doesn't exist for ExtensionCaptureCallback, and the other methods such as
+ // onCaptureSequenceCompleted aren't called for the preview captures
+ if( !has_received_frame ) {
+ has_received_frame = true;
+ if( MyDebug.LOG )
+ Log.d(TAG, "has_received_frame now set to true");
+ }
+
+ super.onCaptureStarted(session, request, timestamp);
+ }
+
+ @Override
+ public void onCaptureProcessStarted(@NonNull CameraExtensionSession session,
+ @NonNull CaptureRequest request) {
+ super.onCaptureProcessStarted(session, request);
+ }
+
+ @Override
+ public void onCaptureFailed(@NonNull CameraExtensionSession session,
+ @NonNull CaptureRequest request) {
+ if( MyDebug.LOG ) {
+ Log.e(TAG, "onCaptureFailed");
+ }
+ super.onCaptureFailed(session, request);
+ }
+
+ @Override
+ public void onCaptureSequenceCompleted(@NonNull CameraExtensionSession session,
+ int sequenceId) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "onCaptureSequenceCompleted");
+ Log.d(TAG, "sequenceId: " + sequenceId);
+ }
+
+ // since we don't receive the request, we can't check for a request tag type of
+ // RequestTagType.CAPTURE - but this method should only be called for photo captures
+ // anyway
+ test_capture_results++;
+ modified_from_camera_settings = false;
+
+ previewCaptureCallback.callCheckImagesCompleted();
+
+ super.onCaptureSequenceCompleted(session, sequenceId);
+ }
+
+ @Override
+ public void onCaptureSequenceAborted(@NonNull CameraExtensionSession session,
+ int sequenceId) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "onCaptureSequenceAborted");
+ Log.d(TAG, "sequenceId: " + sequenceId);
+ }
+ super.onCaptureSequenceAborted(session, sequenceId);
+ }
+ }
+
+ private final MyCaptureCallback previewCaptureCallback = new MyCaptureCallback();
+
+ private class MyCaptureCallback extends CameraCaptureSession.CaptureCallback {
private long last_process_frame_number = 0;
private int last_af_state = -1;
@@ -7212,6 +8068,11 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "frameNumber: " + frameNumber);
Log.d(TAG, "exposure time: " + request.get(CaptureRequest.SENSOR_EXPOSURE_TIME));
}
+ else if( getRequestTagType(request) == RequestTagType.CAPTURE_BURST_IN_PROGRESS ) {
+ Log.d(TAG, "onCaptureStarted: capture burst in progress");
+ Log.d(TAG, "frameNumber: " + frameNumber);
+ Log.d(TAG, "exposure time: " + request.get(CaptureRequest.SENSOR_EXPOSURE_TIME));
+ }
}
// n.b., we don't play the shutter sound here for RequestTagType.CAPTURE, as it typically sounds "too late"
// (if ever we changed this, would also need to fix for burst, where we only set the RequestTagType.CAPTURE for the last image)
@@ -7242,6 +8103,13 @@ public class CameraController2 extends CameraController {
Log.d(TAG, "exposure time: " + request.get(CaptureRequest.SENSOR_EXPOSURE_TIME));
Log.d(TAG, "frame duration: " + request.get(CaptureRequest.SENSOR_FRAME_DURATION));
}
+ else if( getRequestTagType(request) == RequestTagType.CAPTURE_BURST_IN_PROGRESS ) {
+ Log.d(TAG, "onCaptureCompleted: capture burst in progress");
+ Log.d(TAG, "sequenceId: " + result.getSequenceId());
+ Log.d(TAG, "frameNumber: " + result.getFrameNumber());
+ Log.d(TAG, "exposure time: " + request.get(CaptureRequest.SENSOR_EXPOSURE_TIME));
+ Log.d(TAG, "frame duration: " + request.get(CaptureRequest.SENSOR_FRAME_DURATION));
+ }
}
process(request, result);
processCompleted(request, result);
@@ -7740,7 +8608,11 @@ public class CameraController2 extends CameraController {
// (This affects the exposure time shown on on-screen preview - whilst showing the preview exposure time
// isn't necessarily wrong, it tended to confuse people, thinking that manual exposure time wasn't working
// when set above max_preview_exposure_time_c.)
- if( camera_settings.has_iso && camera_settings.exposure_time > max_preview_exposure_time_c )
+ // Update: but on some devices (e.g., Galaxy S10e) the reported exposure time can become inaccurate when
+ // we set longer preview exposure times (fine at 1/15s and 1/10s, but wrong at 0.2s and 0.3s). Possibly this is
+ // by design if the preview only supports certain rates(?), but it's best to fall back to the requested exposure
+ // time in manual mode if the requested exposure is longer than 1/12s OR max_preview_exposure_time_c.
+ if( camera_settings.has_iso && camera_settings.exposure_time > Math.min(max_preview_exposure_time_c, 1000000000L/12) )
capture_result_exposure_time = camera_settings.exposure_time;
if( capture_result_exposure_time <= 0 ) {
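The new condition reports the requested manual exposure time once it exceeds the shorter of `max_preview_exposure_time_c` and 1/12s (`1000000000L/12` ns). A standalone sketch of that rule; the constant value chosen for `max_preview_exposure_time_c` here (1/15s) is an assumption for illustration:

```java
// Sketch of the exposure-time fallback rule from the hunk above; the
// 1/15s value for MAX_PREVIEW_EXPOSURE_TIME_NS is assumed for illustration.
public class ExposureReport {
    static final long NS_PER_S = 1000000000L;
    static final long MAX_PREVIEW_EXPOSURE_TIME_NS = NS_PER_S / 15; // assumed value

    /** Chooses which exposure time to report for the on-screen display. */
    static long reportedExposureTime(boolean hasManualIso, long requestedNs, long captureResultNs) {
        // above this threshold, the preview's reported exposure time may be inaccurate
        long threshold = Math.min(MAX_PREVIEW_EXPOSURE_TIME_NS, NS_PER_S / 12);
        if( hasManualIso && requestedNs > threshold )
            return requestedNs; // fall back to the requested manual exposure time
        return captureResultNs;
    }
}
```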
@@ -7811,7 +8683,7 @@ public class CameraController2 extends CameraController {
/*if( MyDebug.LOG ) {
RggbChannelVector vector = result.get(CaptureResult.COLOR_CORRECTION_GAINS);
if( vector != null ) {
- convertRggbToTemperature(vector); // logging will occur in this function
+ convertRggbVectorToTemperature(vector); // logging will occur in this function
}
}*/
}
@@ -7955,7 +8827,10 @@ public class CameraController2 extends CameraController {
}
}
+ callCheckImagesCompleted();
+ }
+ private void callCheckImagesCompleted() {
// Important that we only call the picture onCompleted callback after we've received the capture request, so
// we need to check if we already received all the images.
// Also needs to be run on UI thread.
@@ -8047,5 +8922,5 @@ public class CameraController2 extends CameraController {
handleCaptureBurstInProgress(result);
}
}
- };
+ }
}
diff --git a/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraControllerManager2.java b/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraControllerManager2.java
index 13b4a980970204a8a6cbfce02210d1351c246b42..f2d17cf7664b3810af09976fcebffdcf263a3b9c 100644
--- a/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraControllerManager2.java
+++ b/app/src/main/java/net/sourceforge/opencamera/cameracontroller/CameraControllerManager2.java
@@ -121,6 +121,15 @@ public class CameraControllerManager2 extends CameraControllerManager {
SizeF physical_size = characteristics.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
android.util.Size pixel_size = characteristics.get(CameraCharacteristics.SENSOR_INFO_PIXEL_ARRAY_SIZE);
float [] focal_lengths = characteristics.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);
+ if( active_size == null || physical_size == null || pixel_size == null || focal_lengths == null || focal_lengths.length == 0 ) {
+ // in theory this should never happen according to the documentation, but I've had a report of physical_size (SENSOR_INFO_PHYSICAL_SIZE)
+ // being null on an EXTERNAL Camera2 device, see https://sourceforge.net/p/opencamera/tickets/754/
+ if( MyDebug.LOG ) {
+ Log.e(TAG, "can't get camera view angles");
+ }
+ // fall back to a default
+ return new SizeF(55.0f, 43.0f);
+ }
//camera_features.view_angle_x = (float)Math.toDegrees(2.0 * Math.atan2(physical_size.getWidth(), (2.0 * focal_lengths[0])));
//camera_features.view_angle_y = (float)Math.toDegrees(2.0 * Math.atan2(physical_size.getHeight(), (2.0 * focal_lengths[0])));
float frac_x = ((float)active_size.width())/(float)pixel_size.getWidth();
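The guarded code computes view angles from the sensor's physical size and focal length, falling back to a 55x43 degree default when any characteristic is null. A sketch following the commented formula in the hunk (ignoring the active-array crop factors `frac_x`/`frac_y` for simplicity):

```java
// Sketch of the view-angle computation, following the commented formula
// above; returns the same 55x43 degree fallback as the diff when data
// is missing. The crop-factor scaling is omitted for simplicity.
public class ViewAngles {
    /** Returns {view_angle_x, view_angle_y} in degrees, or the default on missing data. */
    static float[] computeViewAngles(Float physicalWidth, Float physicalHeight, Float focalLength) {
        if( physicalWidth == null || physicalHeight == null || focalLength == null || focalLength <= 0.0f ) {
            // in theory this never happens, but has been reported on EXTERNAL devices
            return new float[]{55.0f, 43.0f};
        }
        float x = (float)Math.toDegrees(2.0 * Math.atan2(physicalWidth, 2.0 * focalLength));
        float y = (float)Math.toDegrees(2.0 * Math.atan2(physicalHeight, 2.0 * focalLength));
        return new float[]{x, y};
    }
}
```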
diff --git a/app/src/main/java/net/sourceforge/opencamera/preview/ApplicationInterface.java b/app/src/main/java/net/sourceforge/opencamera/preview/ApplicationInterface.java
index d7a3f4ce8d42bf70f20c06e0fe76cfbbba58ba93..a367b441a9fe3f22cfb1cdd5f2a00c45b99c99f6 100644
--- a/app/src/main/java/net/sourceforge/opencamera/preview/ApplicationInterface.java
+++ b/app/src/main/java/net/sourceforge/opencamera/preview/ApplicationInterface.java
@@ -9,10 +9,13 @@ import android.content.Context;
import android.graphics.Canvas;
import android.location.Location;
import android.net.Uri;
+import android.os.Build;
import android.util.Log;
import android.util.Pair;
import android.view.MotionEvent;
+import androidx.annotation.RequiresApi;
+
import net.sourceforge.opencamera.MyDebug;
import net.sourceforge.opencamera.cameracontroller.CameraController;
import net.sourceforge.opencamera.cameracontroller.RawImage;
@@ -31,18 +34,22 @@ public interface ApplicationInterface {
public boolean auto_restart; // whether to automatically restart on hitting max filesize (this setting is still relevant for max_filesize==0, as typically there will still be a device max filesize)
}
- int VIDEOMETHOD_FILE = 0; // video will be saved to a file
- int VIDEOMETHOD_SAF = 1; // video will be saved using Android 5's Storage Access Framework
- int VIDEOMETHOD_URI = 2; // video will be written to the supplied Uri
+ enum VideoMethod {
+ FILE, // video will be saved to a file
+ SAF, // video will be saved using Android 5's Storage Access Framework
+ MEDIASTORE, // video will be saved to the supplied MediaStore Uri
+ URI // video will be written to the supplied Uri
+ }
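Replacing the int constants with an enum lets callers dispatch exhaustively. A sketch of one such dispatch; `usesUri()` is an illustrative helper, not part of the diff:

```java
// Sketch of dispatching on the new VideoMethod enum; usesUri() is a
// hypothetical helper showing which methods address output by Uri.
public class VideoMethodDemo {
    enum VideoMethod { FILE, SAF, MEDIASTORE, URI }

    /** True when the output is addressed by a Uri rather than a plain file path. */
    static boolean usesUri(VideoMethod method) {
        switch( method ) {
            case SAF:
            case MEDIASTORE:
            case URI:
                return true;
            default:
                return false;
        }
    }
}
```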
// methods that request information
Context getContext(); // get the application context
boolean useCamera2(); // should Android 5's Camera 2 API be used?
Location getLocation(); // get current location - null if not available (or you don't care about geotagging)
- int createOutputVideoMethod(); // return a VIDEOMETHOD_* value to specify how to create a video file
- File createOutputVideoFile(String extension) throws IOException; // will be called if createOutputVideoUsingSAF() returns VIDEOMETHOD_FILE; extension is the recommended filename extension for the chosen video type
- Uri createOutputVideoSAF(String extension) throws IOException; // will be called if createOutputVideoUsingSAF() returns VIDEOMETHOD_SAF; extension is the recommended filename extension for the chosen video type
- Uri createOutputVideoUri(); // will be called if createOutputVideoUsingSAF() returns VIDEOMETHOD_URI
+ VideoMethod createOutputVideoMethod(); // return a VideoMethod value to specify how to create a video file
+ File createOutputVideoFile(String extension) throws IOException; // will be called if createOutputVideoMethod() returns VideoMethod.FILE; extension is the recommended filename extension for the chosen video type
+ Uri createOutputVideoSAF(String extension) throws IOException; // will be called if createOutputVideoMethod() returns VideoMethod.SAF; extension is the recommended filename extension for the chosen video type
+ Uri createOutputVideoMediaStore(String extension) throws IOException; // will be called if createOutputVideoMethod() returns VideoMethod.MEDIASTORE; extension is the recommended filename extension for the chosen video type
+ Uri createOutputVideoUri(); // will be called if createOutputVideoMethod() returns VideoMethod.URI
// for all of the get*Pref() methods, you can use Preview methods to get the supported values (e.g., getSupportedSceneModes())
// if you just want a default or don't really care, see the comments for each method for a default or possible options
// if Preview doesn't support the requested setting, it will check this, and choose its own
@@ -117,7 +124,6 @@ public interface ApplicationInterface {
boolean getVideoFlashPref(); // option to switch flash on/off while recording video (should be false in most cases!)
boolean getVideoLowPowerCheckPref(); // whether to stop video automatically on critically low battery
String getPreviewSizePref(); // "preference_preview_size_wysiwyg" is recommended (preview matches aspect ratio of photo resolution as close as possible), but can also be "preference_preview_size_display" to maximise the preview size
- String getPreviewRotationPref(); // return "0" for default; use "180" to rotate the preview 180 degrees
String getLockOrientationPref(); // return "none" for default; use "portrait" or "landscape" to lock photos/videos to that orientation
boolean getTouchCapturePref(); // whether to enable touch to capture
boolean getDoubleTapCapturePref(); // whether to enable double-tap to capture
@@ -133,10 +139,11 @@ public interface ApplicationInterface {
boolean getRecordAudioPref(); // whether to record audio when recording video
String getRecordAudioChannelsPref(); // either "audio_default", "audio_mono" or "audio_stereo"
String getRecordAudioSourcePref(); // "audio_src_camcorder" is recommended, but other options are: "audio_src_mic", "audio_src_default", "audio_src_voice_communication", "audio_src_unprocessed" (unprocessed requires Android 7+); see corresponding values in android.media.MediaRecorder.AudioSource
- int getZoomPref(); // index into Preview.getSupportedZoomRatios() array (each entry is the zoom factor, scaled by 100; array is sorted from min to max zoom)
+ int getZoomPref(); // index into Preview.getSupportedZoomRatios() array (each entry is the zoom factor, scaled by 100; array is sorted from min to max zoom); return -1 for default 1x zoom
double getCalibratedLevelAngle(); // set to non-zero to calibrate the accelerometer used for the level angles
boolean canTakeNewPhoto(); // whether taking new photos is allowed (e.g., can return false if queue for processing images would become full)
boolean imageQueueWouldBlock(int n_raw, int n_jpegs); // called during some burst operations, whether we can allow taking the supplied number of extra photos
+ int getDisplayRotation(); // same behaviour as Activity.getWindowManager().getDefaultDisplay().getRotation() (including returning a member of Surface.ROTATION_*), but allows application to modify e.g. for upside-down preview
// Camera2 only modes:
long getExposureTimePref(); // only called if getISOPref() is not "default"
float getFocusDistancePref(boolean is_target_distance);
@@ -154,6 +161,9 @@ public interface ApplicationInterface {
NRMODE_LOW_LIGHT
}
NRModePref getNRModePref(); // only relevant if getBurstForNoiseReduction() returns true; if this changes without reopening the preview's camera, call Preview.setupBurstMode()
+ boolean isCameraExtensionPref(); // whether to use camera vendor extension (see https://developer.android.com/reference/android/hardware/camera2/CameraExtensionCharacteristics )
+ @RequiresApi(api = Build.VERSION_CODES.S)
+ int getCameraExtensionPref(); // if isCameraExtensionPref() returns true, the camera extension mode to use
float getAperturePref(); // get desired aperture (called if Preview.getSupportedApertures() returns non-null); return -1.0f for no preference
boolean getOptimiseAEForDROPref(); // see CameraController doc for setOptimiseAEForDRO().
enum RawPref {
@@ -162,10 +172,11 @@ public interface ApplicationInterface {
}
RawPref getRawPref(); // whether to enable RAW photos
int getMaxRawImages(); // see documentation of CameraController.setRaw(), corresponds to max_raw_images
+ boolean useCamera2DummyCaptureHack(); // whether to enable CameraController.setDummyCaptureHack() for Camera2 API
boolean useCamera2FakeFlash(); // whether to enable CameraController.setUseCamera2FakeFlash() for Camera2 API
boolean useCamera2FastBurst(); // whether to enable Camera2's captureBurst() for faster taking of expo-bracketing photos (generally should be true, but some devices have problems with captureBurst())
boolean usePhotoVideoRecording(); // whether to enable support for taking photos when recording video (if not supported, this won't be called)
- boolean isPreviewInBackground(); // if true, then Preview can disable real-time effects (e.g., computing histogram)
+ boolean isPreviewInBackground(); // if true, then Preview can disable real-time effects (e.g., computing histogram); also it won't try to open the camera when in the background
boolean allowZoom(); // if false, don't allow zoom functionality even if the device supports it - Preview.supportsZoom() will also return false; if true, allow zoom if the device supports it
// for testing purposes:
@@ -177,9 +188,9 @@ public interface ApplicationInterface {
void startingVideo(); // called just before video recording starts
void startedVideo(); // called just after video recording starts
void stoppingVideo(); // called just before video recording stops; note that if startingVideo() is called but then video recording fails to start, this method will still be called, but startedVideo() and stoppedVideo() won't be called
- void stoppedVideo(final int video_method, final Uri uri, final String filename); // called after video recording stopped (uri/filename will be null if video is corrupt or not created); will be called iff startedVideo() was called
- void restartedVideo(final int video_method, final Uri uri, final String filename); // called after a seamless restart (supported on Android 8+) has occurred - in this case stoppedVideo() is only called for the final video file; this method is instead called for all earlier video file segments
- void deleteUnusedVideo(final int video_method, final Uri uri, final String filename); // application should delete the requested video (which will correspond to a video file previously returned via the createOutputVideo*() methods), either because it is corrupt or unused
+ void stoppedVideo(final VideoMethod video_method, final Uri uri, final String filename); // called after video recording stopped (uri/filename will be null if video is corrupt or not created); will be called iff startedVideo() was called
+ void restartedVideo(final VideoMethod video_method, final Uri uri, final String filename); // called after a seamless restart (supported on Android 8+) has occurred - in this case stoppedVideo() is only called for the final video file; this method is instead called for all earlier video file segments
+ void deleteUnusedVideo(final VideoMethod video_method, final Uri uri, final String filename); // application should delete the requested video (which will correspond to a video file previously returned via the createOutputVideo*() methods), either because it is corrupt or unused
void onFailedStartPreview(); // called if failed to start camera preview
void onCameraError(); // called if the camera closes due to serious error.
void onPhotoError(); // callback for failing to take a photo
@@ -197,6 +208,7 @@ public interface ApplicationInterface {
// methods that request actions
void multitouchZoom(int new_zoom); // indicates that the zoom has changed due to multitouch gesture on preview
+ void requestTakePhoto(); // request taking a photo (due to single/double tap, if either of the getTouchCapturePref() or getDoubleTapCapturePref() options is enabled)
// the set/clear*Pref() methods are called if Preview decides to override the requested pref (because Camera device doesn't support requested pref) (clear*Pref() is called if the feature isn't supported at all)
// the application can use this information to update its preferences
void setCameraIdPref(int cameraId);
diff --git a/app/src/main/java/net/sourceforge/opencamera/preview/BasicApplicationInterface.java b/app/src/main/java/net/sourceforge/opencamera/preview/BasicApplicationInterface.java
index 58395005f0ba191aca7fd2730ec57092dc69d885..b12b7b144bf9088846448167efdaf058397728f7 100644
--- a/app/src/main/java/net/sourceforge/opencamera/preview/BasicApplicationInterface.java
+++ b/app/src/main/java/net/sourceforge/opencamera/preview/BasicApplicationInterface.java
@@ -3,12 +3,16 @@ package net.sourceforge.opencamera.preview;
import java.util.Date;
import java.util.List;
+import android.app.Activity;
import android.graphics.Canvas;
import android.location.Location;
import android.net.Uri;
+import android.os.Build;
import android.util.Pair;
import android.view.MotionEvent;
+import androidx.annotation.RequiresApi;
+
import net.sourceforge.opencamera.cameracontroller.CameraController;
import net.sourceforge.opencamera.cameracontroller.RawImage;
@@ -188,11 +192,6 @@ public abstract class BasicApplicationInterface implements ApplicationInterface
return "preference_preview_size_wysiwyg";
}
- @Override
- public String getPreviewRotationPref() {
- return "0";
- }
-
@Override
public String getLockOrientationPref() {
return "none";
@@ -270,7 +269,7 @@ public abstract class BasicApplicationInterface implements ApplicationInterface
@Override
public int getZoomPref() {
- return 0;
+ return -1;
}
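The default zoom preference changes from index 0 to -1, meaning "no preference, use 1x". A hypothetical sketch of how a caller might resolve -1 against the sorted ratio list documented for `getZoomPref()` (each entry is the zoom factor scaled by 100); `resolveZoomIndex()` is illustrative only:

```java
import java.util.List;

// Hypothetical sketch of resolving the new -1 default for getZoomPref():
// find the index of 1.0x in the sorted list of scaled zoom ratios.
public class ZoomDefault {
    static int resolveZoomIndex(List<Integer> scaledRatios, int pref) {
        if( pref >= 0 )
            return pref; // an explicit preference is already an index
        int idx = scaledRatios.indexOf(100); // 1.0x, scaled by 100 as documented
        return idx >= 0 ? idx : 0;
    }
}
```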
@Override
@@ -288,6 +287,12 @@ public abstract class BasicApplicationInterface implements ApplicationInterface
return false;
}
+ @Override
+ public int getDisplayRotation() {
+ Activity activity = (Activity)this.getContext();
+ return activity.getWindowManager().getDefaultDisplay().getRotation();
+ }
+
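The default implementation above just forwards the system display rotation; the interface comment notes an application may modify it, e.g. for an upside-down preview. A sketch of such a modification, using constants that mirror `android.view.Surface.ROTATION_0..ROTATION_270` (values 0..3) so it runs off-device:

```java
// Sketch of modifying the reported display rotation for an upside-down
// preview; constants mirror android.view.Surface.ROTATION_* values.
public class RotationDemo {
    static final int ROTATION_0 = 0, ROTATION_90 = 1, ROTATION_180 = 2, ROTATION_270 = 3;

    /** Returns the Surface.ROTATION_* value rotated by a further 180 degrees. */
    static int flip180(int rotation) {
        return (rotation + 2) % 4;
    }
}
```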
@Override
public long getExposureTimePref() {
return CameraController.EXPOSURE_TIME_DEFAULT;
@@ -348,6 +353,17 @@ public abstract class BasicApplicationInterface implements ApplicationInterface
return NRModePref.NRMODE_NORMAL;
}
+ @Override
+ public boolean isCameraExtensionPref() {
+ return false;
+ }
+
+ @Override
+ @RequiresApi(api = Build.VERSION_CODES.S)
+ public int getCameraExtensionPref() {
+ return 0;
+ }
+
@Override
public float getAperturePref() {
return -1.0f;
@@ -368,6 +384,11 @@ public abstract class BasicApplicationInterface implements ApplicationInterface
return 2;
}
+ @Override
+ public boolean useCamera2DummyCaptureHack() {
+ return false;
+ }
+
@Override
public boolean useCamera2FakeFlash() {
return false;
@@ -424,16 +445,16 @@ public abstract class BasicApplicationInterface implements ApplicationInterface
}
@Override
- public void stoppedVideo(int video_method, Uri uri, String filename) {
+ public void stoppedVideo(VideoMethod video_method, Uri uri, String filename) {
}
@Override
- public void restartedVideo(final int video_method, final Uri uri, final String filename) {
+ public void restartedVideo(final VideoMethod video_method, final Uri uri, final String filename) {
}
@Override
- public void deleteUnusedVideo(final int video_method, final Uri uri, final String filename) {
+ public void deleteUnusedVideo(final VideoMethod video_method, final Uri uri, final String filename) {
}
@Override
@@ -511,6 +532,10 @@ public abstract class BasicApplicationInterface implements ApplicationInterface
}
+ @Override
+ public void requestTakePhoto() {
+ }
+
@Override
public void setCameraIdPref(int cameraId) {
diff --git a/app/src/main/java/net/sourceforge/opencamera/preview/Preview.java b/app/src/main/java/net/sourceforge/opencamera/preview/Preview.java
index 04ef327e644d7bd2aa0d56ca9c0c2fc868e32719..f2e621d7913cae46014c32cd72220d96123ff003 100644
--- a/app/src/main/java/net/sourceforge/opencamera/preview/Preview.java
+++ b/app/src/main/java/net/sourceforge/opencamera/preview/Preview.java
@@ -53,14 +53,13 @@ import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Matrix;
-import android.graphics.Paint;
import android.graphics.Point;
import android.graphics.Rect;
import android.graphics.RectF;
import android.graphics.SurfaceTexture;
-import android.graphics.Typeface;
import android.hardware.SensorEvent;
import android.hardware.SensorManager;
+import android.hardware.camera2.CameraExtensionCharacteristics;
import android.location.Location;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
@@ -79,10 +78,14 @@ import android.renderscript.RenderScript;
import android.renderscript.Type;
import androidx.annotation.RequiresApi;
import androidx.core.content.ContextCompat;
+import androidx.annotation.NonNull;
import android.util.Log;
import android.util.Pair;
import android.view.Display;
import android.view.GestureDetector;
+import android.view.Gravity;
+import android.view.LayoutInflater;
import android.view.MotionEvent;
import android.view.OrientationEventListener;
import android.view.ScaleGestureDetector;
@@ -94,18 +97,39 @@ import android.view.ViewGroup;
import android.view.ViewParent;
import android.view.WindowManager;
import android.view.View.MeasureSpec;
-import android.view.accessibility.AccessibilityManager;
import android.widget.FrameLayout;
+import android.widget.TextView;
import android.widget.Toast;
-/**
- * This class was originally named due to encapsulating the camera preview,
- * but in practice it's grown to more than this, and includes most of the
- * operation of the camera. It exists at a higher level than CameraController
- * (i.e., this isn't merely a low level wrapper to the camera API, but
- * supports much of the Open Camera logic and functionality). Communication to
- * the rest of the application is available through ApplicationInterface.
- * We could probably do with decoupling this class into separate components!
+/** This class was originally named due to encapsulating the camera preview,
+ * but in practice it's grown to more than this, and includes most of the
+ * operation of the camera. It exists at a higher level than CameraController
+ * (i.e., this isn't merely a low level wrapper to the camera API, but
+ * supports much of the Open Camera logic and functionality). Communication to
+ * the rest of the application is available through ApplicationInterface.
+ * We could probably do with decoupling this class into separate components!
+ *
+ * This class also keeps track of various camera parameters, obtained from the
+ * CameraController class. One design decision concerns parameters that depend
+ * on others (e.g., some resolutions don't support burst; many features aren't
+ * supported with vendor camera extensions). In general we shouldn't apply such restrictions
+ * at this level, as that can cause problems since at the Application level we
+ * may need to know what features are possible in any mode. E.g., if we said
+ * burst mode isn't supported because we're in a camera extension mode, the user
+ * wouldn't be able to switch to Fast Burst mode because the application thinks
+ * burst isn't available! And also for changing preferences in Settings, we
+ * typically want to show all available settings (e.g., showing RAW if it's
+ * available for the current camera, even if not available in the current mode).
+ * There are some exceptions where we need to restrict at the Preview level, e.g.:
+ * - Resolutions (for burst mode, camera extensions) - though the application
+ * can choose to obtain the full list by calling getSupportedPictureSizes()
+ * with check_supported==false.
+ * - Flash modes (for manual ISO or camera extensions).
+ * - Focus modes (for camera extensions).
+ * Similarly we shouldn't restrict available features at the CameraController
+ * class, except where this is unavoidable due to the Android camera API
+ * behaviour (e.g., for scene modes, it may be that some camera features are
+ * affected).
*/
public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextureListener {
private static final String TAG = "Preview";
@@ -158,7 +182,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
//private boolean ui_placement_right = true;
- private boolean app_is_paused = true;
+ private boolean app_is_paused = true; // whether activity is paused
+ private boolean is_paused = true; // whether Preview.onPause() is called - note this could include the application pausing the preview, even if app_is_paused==false
private boolean has_surface;
private boolean has_aspect_ratio;
private double aspect_ratio;
@@ -179,32 +204,45 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
private boolean is_video;
private volatile MediaRecorder video_recorder; // must be volatile for test project reading the state
private volatile boolean video_start_time_set; // must be volatile for test project reading the state
- private long video_start_time; // when the video recording was started, or last resumed if it's was paused
+ private long video_start_time; // system time when the video recording was started, or last resumed if it was paused
private long video_accumulated_time; // this time should be added to (System.currentTimeMillis() - video_start_time) to find the true video duration, that takes into account pausing/resuming, as well as any auto-restarts from max filesize
+ private long video_time_last_maxfilesize_restart; // when the video last restarted due to maxfilesize (or otherwise 0) - note this is time in ms relative to the recorded video, and not system time
private boolean video_recorder_is_paused; // whether video_recorder is running but has paused
private boolean video_restart_on_max_filesize;
private static final long min_safe_restart_video_time = 1000; // if the remaining max time after restart is less than this, don't restart
-
+ /** Stores the file (or similar) to record a video.
+ * Important to call close() when the video recording is finished, to free up any resources
+ * (e.g., supplied ParcelFileDescriptor).
+ */
private static class VideoFileInfo {
- // stores the file (or similar) to record a video
- private final int video_method;
- private final Uri video_uri; // for VIDEOMETHOD_SAF or VIDEOMETHOD_URI
- private final String video_filename; // for VIDEOMETHOD_FILE
- private final ParcelFileDescriptor video_pfd_saf; // for VIDEOMETHOD_SAF
+ private final ApplicationInterface.VideoMethod video_method;
+ private final Uri video_uri; // for VideoMethod.SAF, VideoMethod.URI or VideoMethod.MEDIASTORE
+ private final String video_filename; // for VideoMethod.FILE
+ private final ParcelFileDescriptor video_pfd_saf; // for VideoMethod.SAF, VideoMethod.URI or VideoMethod.MEDIASTORE
VideoFileInfo() {
- this.video_method = ApplicationInterface.VIDEOMETHOD_FILE;
+ this.video_method = ApplicationInterface.VideoMethod.FILE;
this.video_uri = null;
this.video_filename = null;
this.video_pfd_saf = null;
}
-
- VideoFileInfo(int video_method, Uri video_uri, String video_filename, ParcelFileDescriptor video_pfd_saf) {
+ VideoFileInfo(ApplicationInterface.VideoMethod video_method, Uri video_uri, String video_filename, ParcelFileDescriptor video_pfd_saf) {
this.video_method = video_method;
this.video_uri = video_uri;
this.video_filename = video_filename;
this.video_pfd_saf = video_pfd_saf;
}
+
+ void close() {
+ if( this.video_pfd_saf != null ) {
+ try {
+ this.video_pfd_saf.close();
+ }
+ catch(IOException e) {
+ e.printStackTrace();
+ }
+ }
+ }
}
private VideoFileInfo videoFileInfo = new VideoFileInfo();
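The `close()` pattern added to `VideoFileInfo` (hold an optional descriptor, swallow and log `IOException` on close) can be shown in a minimal standalone form; `ResourceHolder` is a hypothetical stand-in for the class, using `Closeable` in place of `ParcelFileDescriptor` so it runs off-device:

```java
import java.io.Closeable;
import java.io.IOException;

// Minimal sketch of the VideoFileInfo.close() pattern: an optional resource
// (like video_pfd_saf) closed with the IOException logged rather than thrown,
// so recording teardown can't crash on a failed close.
public class ResourceHolder {
    private final Closeable resource; // may be null, like video_pfd_saf

    ResourceHolder(Closeable resource) {
        this.resource = resource;
    }

    void close() {
        if( resource != null ) {
            try {
                resource.close();
            }
            catch(IOException e) {
                e.printStackTrace(); // log and continue, as in VideoFileInfo.close()
            }
        }
    }
}
```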
@@ -301,6 +339,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
private ApplicationInterface.CameraResolutionConstraints photo_size_constraints;
private int current_size_index = -1; // this is an index into the sizes array, or -1 if sizes not yet set
+ public List<Integer> supported_extensions; // if non-null, list of supported camera vendor extensions, see https://developer.android.com/reference/android/hardware/camera2/CameraExtensionCharacteristics
+
private boolean supports_video;
private boolean has_capture_rate_factor; // whether we have a capture rate for faster (timelapse) or slow motion
private float capture_rate_factor = 1.0f; // should be 1.0f if has_capture_rate_factor is false; set lower than 1 for slow motion, higher than 1 for timelapse
@@ -320,7 +360,6 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
private boolean using_face_detection;
private CameraController.Face[] faces_detected;
private final RectF face_rect = new RectF();
- private final AccessibilityManager accessibility_manager;
private boolean supports_optical_stabilization;
private boolean supports_video_stabilization;
private boolean supports_photo_video_recording;
@@ -329,8 +368,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
private boolean supports_tonemap_curve;
private float[] supported_apertures;
private boolean has_focus_area;
- private int focus_screen_x;
- private int focus_screen_y;
+ private float focus_camera_x;
+ private float focus_camera_y;
private long focus_complete_time = -1;
private long focus_started_time = -1;
private int focus_success = FOCUS_DONE;
@@ -388,6 +427,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
public volatile int count_cameraContinuousFocusMoving;
public volatile boolean test_fail_open_camera;
public volatile boolean test_video_failure;
+ public volatile boolean test_video_ioexception;
+ public volatile boolean test_video_cameracontrollerexception;
public volatile boolean test_ticker_called; // set from MySurfaceView or CanvasView
public volatile boolean test_called_next_output_file;
public volatile boolean test_started_next_output_file;
@@ -439,7 +480,6 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
gestureDetector = new GestureDetector(getContext(), new GestureDetector.SimpleOnGestureListener());
gestureDetector.setOnDoubleTapListener(new DoubleTapListener());
scaleGestureDetector = new ScaleGestureDetector(getContext(), new ScaleListener());
- accessibility_manager = (AccessibilityManager) activity.getSystemService(Activity.ACCESSIBILITY_SERVICE);
parent.addView(cameraSurface.getView());
if (canvasView != null) {
@@ -484,8 +524,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
boolean mirror = (camera_controller.getFacing() == CameraController.Facing.FACING_FRONT);
camera_to_preview_matrix.setScale(mirror ? -1 : 1, 1);
int display_orientation = camera_controller.getDisplayOrientation();
- if (MyDebug.LOG) {
- Log.d(TAG, "orientation of display relative to camera orientaton: " + display_orientation);
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "orientation of display relative to camera orientation: " + display_orientation);
}
camera_to_preview_matrix.postRotate(display_orientation);
} else {
@@ -496,9 +536,9 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
camera_to_preview_matrix.setScale(1, mirror ? -1 : 1);
int degrees = getDisplayRotationDegrees();
int result = (camera_controller.getCameraOrientation() - degrees + 360) % 360;
- if (MyDebug.LOG) {
- Log.d(TAG, "orientation of display relative to natural orientaton: " + degrees);
- Log.d(TAG, "orientation of display relative to camera orientaton: " + result);
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "orientation of display relative to natural orientation: " + degrees);
+ Log.d(TAG, "orientation of display relative to camera orientation: " + result);
}
camera_to_preview_matrix.postRotate(result);
}
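The rotation arithmetic in this hunk is self-contained and easy to sanity-check outside the app. A minimal sketch of the formula `result = (cameraOrientation - degrees + 360) % 360` (class and method names here are hypothetical, not from the patch):

```java
// Hypothetical helper illustrating the rotation formula used in the hunk above.
class RotationSketch {
    /** Maps the camera sensor orientation and the current display rotation
     *  (both in degrees, clockwise) to the rotation to apply to camera frames.
     *  The +360 keeps the intermediate value non-negative before the modulo. */
    static int cameraToDisplayRotation(int cameraOrientation, int displayDegrees) {
        return (cameraOrientation - displayDegrees + 360) % 360;
    }
}
```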
@@ -528,16 +568,12 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
return preview_to_camera_matrix;
}*/
- private ArrayList<CameraController.Area> getAreas(float x, float y) {
- float[] coords = {x, y};
- calculatePreviewToCameraMatrix();
- preview_to_camera_matrix.mapPoints(coords);
- float focus_x = coords[0];
- float focus_y = coords[1];
-
+ /** Return a focus area from supplied point. Supplied coordinates should be in camera
+ * coordinates.
+ */
+ private ArrayList<CameraController.Area> getAreas(float focus_x, float focus_y) {
int focus_size = 50;
- if (MyDebug.LOG) {
- Log.d(TAG, "x, y: " + x + ", " + y);
+ if( MyDebug.LOG ) {
Log.d(TAG, "focus x, y: " + focus_x + ", " + focus_y);
}
Rect rect = new Rect();
@@ -614,10 +650,6 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
if (touch_was_multitouch) {
return true;
}
- if (!this.is_video && this.isTakingPhotoOrOnTimer()) {
- // if video, okay to refocus when recording
- return true;
- }
// ignore swipes
{
@@ -640,6 +672,24 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
}
+ if( takePhotoOnDoubleTap() ) {
+ // need to wait until onSingleTapConfirmed() before calling handleSingleTouch(), e.g., don't
+ // want to do touch-to-focus if this is part of a double tap
+ return true;
+ }
+
+ return handleSingleTouch(event, was_paused);
+ }
+
+ private boolean handleSingleTouch(MotionEvent event, boolean was_paused) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "handleSingleTouch");
+
+ if( !this.is_video && this.isTakingPhotoOrOnTimer() ) {
+ // if video, okay to refocus when recording
+ return true;
+ }
+
// note, we always try to force start the preview (in case is_preview_paused has become false)
// except if recording video (firstly, the preview should be running; secondly, we don't want to reset the phase!)
if (!this.is_video) {
@@ -647,29 +697,47 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
cancelAutoFocus();
+ boolean touch_capture = applicationInterface.getTouchCapturePref();
+
// don't set focus areas on touch if the user is touching to unpause!
- if (camera_controller != null && !this.using_face_detection && !was_paused) {
+ // similarly if doing single touch to capture (we go straight to taking a photo)
+ // and not supported for camera extensions
+ if( camera_controller != null && !this.using_face_detection && !was_paused && !touch_capture && !camera_controller.isCameraExtension() ) {
this.has_focus_area = false;
- ArrayList<CameraController.Area> areas = getAreas(event.getX(), event.getY());
- if (camera_controller.setFocusAndMeteringArea(areas)) {
- if (MyDebug.LOG)
+
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "x, y: " + event.getX() + ", " + event.getY());
+ }
+ float [] coords = {event.getX(), event.getY()};
+ calculatePreviewToCameraMatrix();
+ preview_to_camera_matrix.mapPoints(coords);
+ float focus_x = coords[0];
+ float focus_y = coords[1];
+ ArrayList<CameraController.Area> areas = getAreas(focus_x, focus_y);
+
+ if( camera_controller.setFocusAndMeteringArea(areas) ) {
+ if( MyDebug.LOG )
Log.d(TAG, "set focus (and metering?) area");
this.has_focus_area = true;
- this.focus_screen_x = (int) event.getX();
- this.focus_screen_y = (int) event.getY();
- } else {
- if (MyDebug.LOG)
+ this.focus_camera_x = focus_x;
+ this.focus_camera_y = focus_y;
+ }
+ else {
+ if( MyDebug.LOG )
Log.d(TAG, "didn't set focus area in this mode, may have set metering");
// don't set has_focus_area in this mode
}
}
// don't take a photo on touch if the user is touching to unpause!
- if (!this.is_video && !was_paused && applicationInterface.getTouchCapturePref()) {
- if (MyDebug.LOG)
+ if( !this.is_video && !was_paused && touch_capture ) {
+ if( MyDebug.LOG )
Log.d(TAG, "touch to capture");
- // interpret as if user had clicked take photo/video button, except that we set the focus/metering areas
- this.takePicturePressed(false, false);
+ // Interpret as if user had clicked take photo/video button, except that we set the focus/metering areas.
+ // We go via ApplicationInterface instead of going direct to Preview.takePicturePressed(), so that
+ // the application can handle same as if user had pressed shutter button (needed so that this works
+ // correctly in Panorama mode).
+ applicationInterface.requestTakePhoto();
return true;
}
@@ -682,10 +750,18 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
//@SuppressLint("ClickableViewAccessibility") @Override
- /**
- * Handle multitouch zoom.
+ // When pinch zooming, we'd normally have the problem that zooming is too fast, because we can
+ // only zoom to the limited set of values in the zoom_ratios array. So when pinch zooming, we
+ // keep track of the fractional scaled zoom.
+ private boolean has_smooth_zoom = false;
+ private float smooth_zoom = 1.0f;
+
+ /** Handle multitouch zoom.
*/
private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
+ private boolean has_multitouch_start_zoom_factor = false;
+ private int multitouch_start_zoom_factor = 0;
+
@Override
public boolean onScale(ScaleGestureDetector detector) {
if (Preview.this.camera_controller != null && Preview.this.has_zoom) {
@@ -693,26 +769,106 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
return true;
}
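The `has_smooth_zoom`/`smooth_zoom` fields introduced above address quantization: pinch gestures produce fractional scale factors, but the camera only exposes the discrete `zoom_ratios` table. A standalone sketch of that accumulate-then-quantize idea, with hypothetical names (the real `onScale()` delegates to the zoom machinery elsewhere in Preview):

```java
import java.util.List;

// Hypothetical sketch of fractional pinch-zoom tracking, based on the comment in the patch.
class SmoothZoomSketch {
    float smoothZoom = 1.0f;          // fractional zoom, tracked across onScale() calls
    final List<Integer> zoomRatios;   // discrete zoom values in percent, ascending (100 == 1x)

    SmoothZoomSketch(List<Integer> zoomRatios) {
        this.zoomRatios = zoomRatios;
    }

    /** Applies one pinch scale factor and returns the nearest discrete zoom index. */
    int onScale(float scaleFactor) {
        smoothZoom *= scaleFactor;
        // clamp to the supported range
        float min = zoomRatios.get(0) / 100.0f;
        float max = zoomRatios.get(zoomRatios.size() - 1) / 100.0f;
        smoothZoom = Math.max(min, Math.min(max, smoothZoom));
        // quantize to the closest entry, but keep the fractional value for the next call,
        // so repeated small pinches accumulate instead of being rounded away
        int best = 0;
        float bestDiff = Float.MAX_VALUE;
        for(int i = 0; i < zoomRatios.size(); i++) {
            float diff = Math.abs(zoomRatios.get(i) / 100.0f - smoothZoom);
            if( diff < bestDiff ) {
                bestDiff = diff;
                best = i;
            }
        }
        return best;
    }
}
```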
+
+ @Override
+ public boolean onScaleBegin(ScaleGestureDetector detector) {
+ if( has_zoom && camera_controller != null ) {
+ has_multitouch_start_zoom_factor = true;
+ multitouch_start_zoom_factor = camera_controller.getZoom();
+ has_smooth_zoom = true;
+ smooth_zoom = zoom_ratios.get(multitouch_start_zoom_factor)/100.0f;
+ }
+ else {
+ has_multitouch_start_zoom_factor = false;
+ multitouch_start_zoom_factor = 0;
+ has_smooth_zoom = false;
+ smooth_zoom = 1.0f;
+ }
+ return true;
+ }
+
+ @Override
+ public void onScaleEnd(ScaleGestureDetector detector) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onScaleEnd");
+ if( has_multitouch_start_zoom_factor && has_zoom && camera_controller != null && zoom_ratios.get(0) < 100 ) {
+ // when the minimum zoom is less than 1x, we should support snapping to 1x, so it's easy for the user to
+ // switch back to 1x zoom when using pinch zoom
+ int start_zoom = zoom_ratios.get(multitouch_start_zoom_factor);
+ final int end_zoom_factor = camera_controller.getZoom();
+ int end_zoom = zoom_ratios.get(end_zoom_factor);
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "start_zoom: " + start_zoom);
+ Log.d(TAG, "end_zoom : " + end_zoom);
+ }
+ if( end_zoom >= 90 && end_zoom <= 110 && start_zoom != 100 && end_zoom != 100 ) {
+ int start_diff = start_zoom - 100;
+ int end_diff = end_zoom - 100;
+ if( Math.signum(start_diff) == Math.signum(end_diff) && Math.abs(end_diff) >= Math.abs(start_diff) ) {
+ // we only want to snap when moving towards 1x, or have crossed over 1x
+ }
+ else {
+ if( MyDebug.LOG )
+ Log.d(TAG, "snapped pinch zoom to 1x zoom");
+ int snapped_zoom = find1xZoom();
+ zoomTo(snapped_zoom);
+ }
+ }
+ }
+ has_multitouch_start_zoom_factor = false;
+ multitouch_start_zoom_factor = 0;
+ has_smooth_zoom = false;
+ smooth_zoom = 1.0f;
+ }
+ }
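The snapping condition in `onScaleEnd()` above can be isolated as a pure predicate, which makes the "only snap when moving towards or across 1x" rule easier to see. A sketch with hypothetical names (zoom values are percentages, so 100 == 1x, matching `zoom_ratios`):

```java
// Hypothetical extraction of the snap-to-1x heuristic from onScaleEnd() above.
class ZoomSnapSketch {
    /** Returns whether a pinch gesture from startZoom to endZoom should snap to exactly 1x.
     *  Mirrors the patch: snap only inside the 90..110 window, and only when the
     *  gesture moved towards 1x or crossed over it. */
    static boolean shouldSnapTo1x(int startZoom, int endZoom) {
        if( endZoom < 90 || endZoom > 110 || startZoom == 100 || endZoom == 100 )
            return false;
        int startDiff = startZoom - 100;
        int endDiff = endZoom - 100;
        // same side of 1x and not getting closer: the user is zooming away, so don't snap
        if( Math.signum(startDiff) == Math.signum(endDiff) && Math.abs(endDiff) >= Math.abs(startDiff) )
            return false;
        return true;
    }
}
```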
+
+ /** Returns whether we will take a photo on a double tap.
+ */
+ private boolean takePhotoOnDoubleTap() {
+ return !is_video && applicationInterface.getDoubleTapCapturePref();
}
@SuppressWarnings("SameReturnValue")
public boolean onDoubleTap() {
if (MyDebug.LOG)
Log.d(TAG, "onDoubleTap()");
- if (!is_video && applicationInterface.getDoubleTapCapturePref()) {
- if (MyDebug.LOG)
+ if( takePhotoOnDoubleTap() ) {
+ if( MyDebug.LOG )
Log.d(TAG, "double-tap to capture");
- // interpret as if user had clicked take photo/video button (don't need to set focus/metering, as this was done in touchEvent() for the first touch of the double-tap)
- takePicturePressed(false, false);
+ // Interpret as if user had clicked take photo/video button.
+ // We go via ApplicationInterface instead of going direct to Preview.takePicturePressed(), so that
+ // the application can handle same as if user had pressed shutter button (needed so that this works
+ // correctly in Panorama mode).
+ applicationInterface.requestTakePhoto();
}
return true;
}
private class DoubleTapListener extends GestureDetector.SimpleOnGestureListener {
+ @Override
+ public boolean onSingleTapConfirmed(MotionEvent e) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onSingleTapConfirmed");
+ // If we're taking a photo on double tap, then for single taps we need to wait until these are confirmed
+ // otherwise we handle via Preview.touchEvent().
+ // Arguably we could handle everything via onSingleTapConfirmed(), but want to avoid
+ // unexpected changes of behaviour - plus it would mean a slight delay for touch to
+ // focus (since onSingleTapConfirmed obviously has to wait to be sure this isn't a
+ // double tap).
+ if( takePhotoOnDoubleTap() ) {
+ // now safe to handle the single touch
+ boolean was_paused = !is_preview_started;
+ if( MyDebug.LOG )
+ Log.d(TAG, "was_paused: " + was_paused);
+ return handleSingleTouch(e, was_paused);
+ }
+ return false;
+ }
+
@Override
public boolean onDoubleTap(MotionEvent e) {
- if (MyDebug.LOG)
- Log.d(TAG, "onDoubleTap()");
+ if( MyDebug.LOG )
+ Log.d(TAG, "onDoubleTap");
return Preview.this.onDoubleTap();
}
}
@@ -726,7 +882,10 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
return;
}
// don't cancelAutoFocus() here, otherwise we get sluggish zoom behaviour on Camera2 API
- camera_controller.clearFocusAndMetering();
+ if( !camera_controller.isCameraExtension() ) {
+ // if using camera extensions, we could never have set focus and metering in the first place
+ camera_controller.clearFocusAndMetering();
+ }
has_focus_area = false;
focus_success = FOCUS_DONE;
successfully_focused = false;
@@ -806,8 +965,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
@Override
- public void surfaceCreated(SurfaceHolder holder) {
- if (MyDebug.LOG)
+ public void surfaceCreated(@NonNull SurfaceHolder holder) {
+ if( MyDebug.LOG )
Log.d(TAG, "surfaceCreated()");
// The Surface has been created, acquire the camera and tell it where
// to draw.
@@ -816,8 +975,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
@Override
- public void surfaceDestroyed(SurfaceHolder holder) {
- if (MyDebug.LOG)
+ public void surfaceDestroyed(@NonNull SurfaceHolder holder) {
+ if( MyDebug.LOG )
Log.d(TAG, "surfaceDestroyed()");
// Surface will be destroyed when we return, so stop the preview.
// Because the CameraDevice object is not a shared resource, it's very
@@ -826,8 +985,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
@Override
- public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
- if (MyDebug.LOG)
+ public void surfaceChanged(@NonNull SurfaceHolder holder, int format, int w, int h) {
+ if( MyDebug.LOG )
Log.d(TAG, "surfaceChanged " + w + ", " + h);
if (holder.getSurface() == null) {
// preview surface does not exist
@@ -837,8 +996,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
@Override
- public void onSurfaceTextureAvailable(SurfaceTexture arg0, int width, int height) {
- if (MyDebug.LOG)
+ public void onSurfaceTextureAvailable(@NonNull SurfaceTexture arg0, int width, int height) {
+ if( MyDebug.LOG )
Log.d(TAG, "onSurfaceTextureAvailable()");
this.set_textureview_size = true;
this.textureview_w = width;
@@ -847,8 +1006,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
@Override
- public boolean onSurfaceTextureDestroyed(SurfaceTexture arg0) {
- if (MyDebug.LOG)
+ public boolean onSurfaceTextureDestroyed(@NonNull SurfaceTexture arg0) {
+ if( MyDebug.LOG )
Log.d(TAG, "onSurfaceTextureDestroyed()");
this.set_textureview_size = false;
this.textureview_w = 0;
@@ -858,8 +1017,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
@Override
- public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) {
- if (MyDebug.LOG) {
+ public void onSurfaceTextureSizeChanged(@NonNull SurfaceTexture texture, int width, int height) {
+ if( MyDebug.LOG ) {
Log.d(TAG, "onSurfaceTextureSizeChanged " + width + ", " + height);
//Log.d(TAG, "surface texture is now: " + ((TextureView)cameraSurface).getSurfaceTexture());
}
@@ -893,7 +1052,7 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
@Override
- public void onSurfaceTextureUpdated(SurfaceTexture arg0) {
+ public void onSurfaceTextureUpdated(@NonNull SurfaceTexture arg0) {
refreshPreviewBitmap();
}
@@ -907,13 +1066,13 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
if (MyDebug.LOG)
Log.d(TAG, "textureview size: " + textureview_w + ", " + textureview_h);
- int rotation = getDisplayRotation();
+ int rotation = applicationInterface.getDisplayRotation();
Matrix matrix = new Matrix();
RectF viewRect = new RectF(0, 0, this.textureview_w, this.textureview_h);
RectF bufferRect = new RectF(0, 0, this.preview_h, this.preview_w);
float centerX = viewRect.centerX();
float centerY = viewRect.centerY();
- if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {
+ if( rotation == Surface.ROTATION_90 || rotation == Surface.ROTATION_270 ) {
bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
float scale = Math.max(
@@ -922,6 +1081,9 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
matrix.postScale(scale, scale, centerX, centerY);
matrix.postRotate(90 * (rotation - 2), centerX, centerY);
}
+ else if( rotation == Surface.ROTATION_180 ) {
+ matrix.postRotate(180, centerX, centerY);
+ }
cameraSurface.setTransform(matrix);
}
@@ -967,9 +1129,12 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
// stop() can throw a RuntimeException if stop is called too soon after start - this indicates the video file is corrupt, and should be deleted
if (MyDebug.LOG)
Log.d(TAG, "runtime exception when stopping video");
+ videoFileInfo.close();
applicationInterface.deleteUnusedVideo(videoFileInfo.video_method, videoFileInfo.video_uri, videoFileInfo.video_filename);
videoFileInfo = new VideoFileInfo();
+ if( nextVideoFileInfo != null )
+ nextVideoFileInfo.close();
nextVideoFileInfo = null;
// if video recording is stopped quickly after starting, it's normal that we might not have saved a valid file, so no need to display a message
if (!video_start_time_set || System.currentTimeMillis() - video_start_time > 2000) {
@@ -992,11 +1157,15 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
video_recorder_is_paused = false;
applicationInterface.cameraInOperation(false, true);
reconnectCamera(false); // n.b., if something went wrong with video, then we reopen the camera - which may fail (or simply not reopen, e.g., if app is now paused)
+ videoFileInfo.close();
applicationInterface.stoppedVideo(videoFileInfo.video_method, videoFileInfo.video_uri, videoFileInfo.video_filename);
if (nextVideoFileInfo != null) {
// if nextVideoFileInfo is not-null, it means we received MEDIA_RECORDER_INFO_MAX_FILESIZE_APPROACHING but not
// MEDIA_RECORDER_INFO_NEXT_OUTPUT_FILE_STARTED, so it is the application responsibility to create the zero-size
// video file that will have been created
+ if( MyDebug.LOG )
+ Log.d(TAG, "delete unused next video file");
+ nextVideoFileInfo.close();
applicationInterface.deleteUnusedVideo(nextVideoFileInfo.video_method, nextVideoFileInfo.video_uri, nextVideoFileInfo.video_filename);
}
videoFileInfo = new VideoFileInfo();
@@ -1007,8 +1176,7 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
return applicationInterface.getContext();
}
- /**
- * Restart video - either due to hitting maximum filesize, or maximum duration.
+ /** Restart video - either due to hitting maximum filesize (for pre-Android 8 when not able to restart seamlessly), or maximum duration.
*/
private void restartVideo(boolean due_to_max_filesize) {
if (MyDebug.LOG)
@@ -1159,6 +1327,12 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
}
+ /** Closes the camera.
+ * @param async Whether to close the camera on a background thread.
+ * @param closeCameraCallback If async is true, closeCameraCallback.onClosed() will be called,
+ * from the UI thread, once the camera is closed. If async is false,
+ * this field is ignored.
+ */
private void closeCamera(boolean async, final CloseCameraCallback closeCameraCallback) {
long debug_time = 0;
if (MyDebug.LOG) {
@@ -1329,8 +1503,15 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "openCamera()");
debug_time = System.currentTimeMillis();
}
- if (camera_open_state == CameraOpenState.CAMERAOPENSTATE_OPENING) {
- if (MyDebug.LOG)
+ if( applicationInterface.isPreviewInBackground() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "don't open camera as preview in background");
+ // note, even if the application never tries to reopen the camera in the background, we still need this check to avoid the camera
+ // opening from mySurfaceCreated()
+ return;
+ }
+ else if( camera_open_state == CameraOpenState.CAMERAOPENSTATE_OPENING ) {
+ if( MyDebug.LOG )
Log.d(TAG, "already opening camera in background thread");
return;
} else if (camera_open_state == CameraOpenState.CAMERAOPENSTATE_CLOSING) {
@@ -1418,9 +1599,9 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
return;
}
- if (this.app_is_paused) {
- if (MyDebug.LOG) {
- Log.d(TAG, "don't open camera as app is paused");
+ if( this.is_paused ) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "don't open camera as paused");
}
return;
}
@@ -1800,6 +1981,47 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
this.switchVideo(true, false);
}
+ // seems sensible to set extension mode (or not) first
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.S && this.supported_extensions != null && applicationInterface.isCameraExtensionPref() ) {
+ int extension = applicationInterface.getCameraExtensionPref();
+ if( this.supported_extensions.contains(extension) ) {
+ camera_controller.setCameraExtension(true, extension);
+
+ // also filter unsupported flash modes
+ if( supported_flash_values != null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "restrict flash modes for extension session");
+ List<String> new_supported_flash_values = new ArrayList<>();
+ for(String supported_flash_value : supported_flash_values) {
+ switch( supported_flash_value ) {
+ case "flash_off":
+ case "flash_frontscreen_torch":
+ new_supported_flash_values.add(supported_flash_value);
+ break;
+ }
+ }
+ supported_flash_values = new_supported_flash_values;
+ }
+
+ // also disallow focus modes
+ if( supported_focus_values != null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "restrict focus modes for extension session");
+ supported_focus_values = null;
+ }
+
+ // and disable ae and awb lock (as normally we don't set this when stopping/starting preview)
+ camera_controller.setAutoExposureLock(false);
+ camera_controller.setAutoWhiteBalanceLock(false);
+ }
+ else {
+ camera_controller.setCameraExtension(false, 0);
+ }
+ }
+ else {
+ camera_controller.setCameraExtension(false, 0);
+ }
+
setupCameraParameters();
updateFlashForVideo();
@@ -1857,45 +2079,56 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
setupBurstMode();
- if (camera_controller.isBurstOrExpo()) {
- if (MyDebug.LOG)
- Log.d(TAG, "check photo resolution supports burst");
- CameraController.Size current_size = getCurrentPictureSize();
- if (MyDebug.LOG && current_size != null) {
- Log.d(TAG, "current_size: " + current_size.width + " x " + current_size.height + " supports_burst? " + current_size.supports_burst);
- }
- if (current_size != null && !current_size.supports_burst) {
- if (MyDebug.LOG)
- Log.d(TAG, "burst mode: current picture size doesn't support burst");
- // set to next largest that supports burst
- CameraController.Size new_size = null;
- for (int i = 0; i < photo_sizes.size(); i++) {
- CameraController.Size size = photo_sizes.get(i);
- if (size.supports_burst && size.width * size.height <= current_size.width * current_size.height) {
- if (new_size == null || size.width * size.height > new_size.width * new_size.height) {
- current_size_index = i;
- new_size = size;
- }
- }
+ {
+ boolean is_burst = camera_controller.isBurstOrExpo();
+ boolean is_extension = camera_controller.isCameraExtension();
+ int extension = is_extension ? camera_controller.getCameraExtension() : -1;
+ if( is_burst || is_extension ) {
+ if( MyDebug.LOG ) {
+ if( is_burst )
+ Log.d(TAG, "check photo resolution supports burst");
+ if( is_extension )
+ Log.d(TAG, "check photo resolution supports extension: " + extension);
}
- if (new_size == null) {
- Log.e(TAG, "can't find burst-supporting picture size smaller than the current picture size");
- // just find largest that supports burst
- for (int i = 0; i < photo_sizes.size(); i++) {
- CameraController.Size size = photo_sizes.get(i);
- if (size.supports_burst) {
- if (new_size == null || size.width * size.height > new_size.width * new_size.height) {
- current_size_index = i;
- new_size = size;
+ CameraController.Size current_size = getCurrentPictureSize();
+ if( current_size != null ) {
+ if( MyDebug.LOG ) {
+ Log.d(TAG, "current_size: " + current_size.width + " x " + current_size.height + " supports_burst? " + current_size.supports_burst);
+ }
+ if( !current_size.supportsRequirements(is_burst, is_extension, extension) ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "burst mode: current picture size doesn't support required burst and/or extension");
+ // set to next largest that supports what we need
+ CameraController.Size new_size = null;
+ for(int i=0;i<photo_sizes.size();i++) {
+ CameraController.Size size = photo_sizes.get(i);
+ if( size.supportsRequirements(is_burst, is_extension, extension) && size.width*size.height <= current_size.width*current_size.height ) {
+ if( new_size == null || size.width*size.height > new_size.width*new_size.height ) {
+ current_size_index = i;
+ new_size = size;
+ }
}
}
- }
- if (new_size == null) {
- Log.e(TAG, "can't find burst-supporting picture size");
+ if( new_size == null ) {
+ Log.e(TAG, "can't find supporting picture size smaller than the current picture size");
+ // just find largest that supports requirements
+ for(int i=0;i<photo_sizes.size();i++) {
+ CameraController.Size size = photo_sizes.get(i);
+ if( size.supportsRequirements(is_burst, is_extension, extension) ) {
+ if( new_size == null || size.width*size.height > new_size.width*new_size.height ) {
+ current_size_index = i;
+ new_size = size;
+ }
+ }
+ }
+ if( new_size == null ) {
+ Log.e(TAG, "can't find supporting picture size");
+ }
+ }
+ // if we set a new size, we don't save this to applicationinterface (so that if user switches to a burst mode or extension mode and back
+ // when the original resolution doesn't support burst/extension we revert to the original resolution)
}
}
- // if we set a new size, we don't save this to applicationinterface (so that if user switches to a burst mode and back
- // when the original resolution doesn't support burst we revert to the original resolution)
}
}
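The two fall-back searches above implement "largest size no bigger than the current one that meets the requirements, else largest supported overall". A condensed sketch of that selection, with hypothetical types (`supportsRequirements` is stubbed by a predicate, since the real check depends on `CameraController`):

```java
import java.util.List;
import java.util.function.Predicate;

// Hypothetical sketch of the picture-size fallback search in the hunk above.
class SizeSelectSketch {
    static class Size {
        final int width, height;
        Size(int width, int height) { this.width = width; this.height = height; }
        long area() { return (long)width * height; }
    }

    /** Returns the index of the largest supported size with area <= maxArea,
     *  or -1 if none qualifies. Pass Long.MAX_VALUE for the unconstrained fallback pass. */
    static int largestSupported(List<Size> sizes, Predicate<Size> supported, long maxArea) {
        int best = -1;
        for(int i = 0; i < sizes.size(); i++) {
            Size size = sizes.get(i);
            if( supported.test(size) && size.area() <= maxArea
                    && (best == -1 || size.area() > sizes.get(best).area()) )
                best = i;
        }
        return best;
    }
}
```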
@@ -1903,6 +2136,7 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
// Must set preview size before starting camera preview
// and must do it after setting photo vs video mode
+ // and after setting what camera extension we're using (if any)
setPreviewSize(); // need to call this when we switch cameras, not just when we run for the first time
if (MyDebug.LOG) {
Log.d(TAG, "setupCamera: time after setting preview size: " + (System.currentTimeMillis() - debug_time));
@@ -1915,9 +2149,13 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
// must be done after setting parameters, as this function may set parameters
// also needs to be done after starting preview for some devices (e.g., Nexus 7)
- if (this.has_zoom && applicationInterface.getZoomPref() != 0) {
- zoomTo(applicationInterface.getZoomPref());
- if (MyDebug.LOG) {
+ if( this.has_zoom ) {
+ int zoom_pref = applicationInterface.getZoomPref();
+ if( zoom_pref == -1 ) {
+ zoom_pref = find1xZoom();
+ }
+ zoomTo(zoom_pref);
+ if( MyDebug.LOG ) {
Log.d(TAG, "setupCamera: total time after zoomTo: " + (System.currentTimeMillis() - debug_time));
}
} else if (camera_controller_supports_zoom && !has_zoom) {
@@ -1925,7 +2163,7 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "camera supports zoom but application disabled zoom, so reset zoom to default");
// if the application switches zoom off via ApplicationInterface.allowZoom(), we need to support
// resetting the zoom (in case the application called setupCamera() rather than reopening the camera).
- camera_controller.setZoom(0);
+ camera_controller.resetZoom();
}
/*if( take_photo ) {
@@ -1976,6 +2214,15 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
}
+ private int find1xZoom() {
+ for(int i=0;i<zoom_ratios.size();i++) {
+ if( zoom_ratios.get(i) == 100 ) {
+ return i;
+ }
+ }
+ return 0;
+ }
- if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN && accessibility_manager.isEnabled() && accessibility_manager.isTouchExplorationEnabled()) {
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN
+ ) {
int n_faces = local_faces.length;
FaceLocation face_location = FaceLocation.FACELOCATION_UNKNOWN;
if (n_faces > 0) {
@@ -2189,13 +2436,20 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
float avg_x = 0, avg_y = 0;
final float bdry_frac_c = 0.35f;
boolean all_centre = true;
- for (CameraController.Face face : local_faces) {
- float face_x = face.rect.centerX();
- float face_y = face.rect.centerY();
- face_x /= (float) cameraSurface.getView().getWidth();
- face_y /= (float) cameraSurface.getView().getHeight();
- if (all_centre) {
- if (face_x < bdry_frac_c || face_x > 1.0f - bdry_frac_c || face_y < bdry_frac_c || face_y > 1.0f - bdry_frac_c)
+ final Matrix matrix = getCameraToPreviewMatrix();
+ for(CameraController.Face face : local_faces) {
+ //float face_x = face.rect.centerX();
+ //float face_y = face.rect.centerY();
+ // convert to screen space coordinates
+ face_rect.set(face.rect);
+ matrix.mapRect(face_rect);
+ float face_x = face_rect.centerX();
+ float face_y = face_rect.centerY();
+
+ face_x /= (float)cameraSurface.getView().getWidth();
+ face_y /= (float)cameraSurface.getView().getHeight();
+ if( all_centre ) {
+ if( face_x < bdry_frac_c || face_x > 1.0f-bdry_frac_c || face_y < bdry_frac_c || face_y > 1.0f-bdry_frac_c )
all_centre = false;
}
avg_x += face_x;
@@ -2297,6 +2551,9 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
camera_controller.setFaceDetectionListener(new MyFaceDetectionListener());
}
+ else {
+ camera_controller.setFaceDetectionListener(null);
+ }
}
if (MyDebug.LOG) {
Log.d(TAG, "setupCameraParameters: time after setting face detection: " + (System.currentTimeMillis() - debug_time));
@@ -2421,7 +2678,12 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
if (MyDebug.LOG)
Log.d(TAG, "saved iso: " + value);
boolean is_manual_iso = false;
- if (supports_iso_range) {
+ boolean is_extension = camera_controller.isCameraExtension();
+ if( is_extension ) {
+ // manual ISO not supported for camera extensions
+ camera_controller.setManualISO(false, 0);
+ }
+ else if( supports_iso_range ) {
// in this mode, we can set any ISO value from min to max
this.isos = null; // if supports_iso_range==true, caller shouldn't be using getSupportedISOs()
@@ -2651,7 +2913,7 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
// size set later in setPreviewSize()
- // also note that we check for compatibility with burst (CameraController.Size.supports_burst) later on
+ // also note that we check for compatibility with burst (CameraController.Size.supports_burst) and extensions later on
}
if (MyDebug.LOG) {
Log.d(TAG, "setupCameraParameters: time after picture sizes: " + (System.currentTimeMillis() - debug_time));
@@ -2813,7 +3075,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
// if in manual ISO mode, we'll have restricted the available flash modes - so although we want to
// communicate this to the application, we don't want to save the new value we've chosen (otherwise
// if user goes to manual ISO and back, we might switch saved flash say from auto to off)
- updateFlash(0, !is_manual_iso);
+ // similarly for camera extension modes
+ updateFlash(0, !is_manual_iso && !is_extension );
}
} else {
if (MyDebug.LOG)
@@ -3516,8 +3779,16 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
for (CameraController.Size size : sizes) {
if (MyDebug.LOG)
Log.d(TAG, " supported preview size: " + size.width + ", " + size.height);
- double ratio = (double) size.width / size.height;
- if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE)
+ if( camera_controller.isCameraExtension() ) {
+ int extension = camera_controller.getCameraExtension();
+ if( !size.supportsExtension(extension) ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, " not supported by current extension: " + extension);
+ continue;
+ }
+ }
+ double ratio = (double)size.width / size.height;
+ if( Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE )
continue;
if (Math.abs(size.height - targetHeight) < minDiff) {
optimalSize = size;
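The hunk above filters preview sizes that the active vendor extension cannot handle before applying the usual aspect-ratio/height selection. A minimal pure-Java sketch of that selection logic, using a simplified stand-in `Size` type rather than the real `CameraController.Size` (the tolerance value and field names here are illustrative assumptions):

```java
import java.util.*;

// Hedged sketch of the preview-size selection above: skip sizes the current
// vendor extension can't handle, then pick the size whose aspect ratio is
// within ASPECT_TOLERANCE of the target and whose height is closest to the
// target. Size is a simplified stand-in, not the real CameraController class.
public class PreviewSizeChooser {
    static final double ASPECT_TOLERANCE = 0.05;

    record Size(int width, int height, Set<Integer> extensions) {
        boolean supportsExtension(int ext) { return extensions.contains(ext); }
    }

    static Size chooseOptimalSize(List<Size> sizes, double targetRatio, int targetHeight,
                                  boolean isExtension, int extension) {
        Size optimal = null;
        double minDiff = Double.MAX_VALUE;
        for( Size size : sizes ) {
            if( isExtension && !size.supportsExtension(extension) )
                continue; // not supported by the current vendor extension
            double ratio = (double)size.width() / size.height();
            if( Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE )
                continue; // wrong aspect ratio
            if( Math.abs(size.height() - targetHeight) < minDiff ) {
                optimal = size;
                minDiff = Math.abs(size.height() - targetHeight);
            }
        }
        return optimal;
    }

    public static void main(String[] args) {
        List<Size> sizes = List.of(
            new Size(1920, 1080, Set.of(1)),
            new Size(1280, 720, Set.of(1, 2)),
            new Size(640, 480, Set.of(2)));
        // 16:9 target with extension 2 active: only 1280x720 qualifies
        Size s = chooseOptimalSize(sizes, 16.0/9.0, 1000, true, 2);
        System.out.println(s.width() + "x" + s.height()); // 1280x720
    }
}
```

The extension check runs first so that an extension-incompatible size never wins on aspect ratio alone.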
@@ -3625,46 +3896,12 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
return aspect_ratio;
}
- /**
- * Returns the ROTATION_* enum of the display relative to the natural device orientation.
- */
- public int getDisplayRotation() {
- // gets the display rotation (as a Surface.ROTATION_* constant), taking into account the getRotatePreviewPreferenceKey() setting
- Activity activity = (Activity) this.getContext();
- int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
-
- String rotate_preview = applicationInterface.getPreviewRotationPref();
- if (MyDebug.LOG)
- Log.d(TAG, " rotate_preview = " + rotate_preview);
- if (rotate_preview.equals("180")) {
- switch (rotation) {
- case Surface.ROTATION_0:
- rotation = Surface.ROTATION_180;
- break;
- case Surface.ROTATION_90:
- rotation = Surface.ROTATION_270;
- break;
- case Surface.ROTATION_180:
- rotation = Surface.ROTATION_0;
- break;
- case Surface.ROTATION_270:
- rotation = Surface.ROTATION_90;
- break;
- default:
- break;
- }
- }
-
- return rotation;
- }
-
- /**
- * Returns the rotation in degrees of the display relative to the natural device orientation.
+ /** Returns the rotation in degrees of the display relative to the natural device orientation.
*/
private int getDisplayRotationDegrees() {
if (MyDebug.LOG)
Log.d(TAG, "getDisplayRotationDegrees");
- int rotation = getDisplayRotation();
+ int rotation = applicationInterface.getDisplayRotation();
int degrees = 0;
switch (rotation) {
case Surface.ROTATION_0:
@@ -3805,9 +4042,9 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
public void draw(Canvas canvas) {
/*if( MyDebug.LOG )
Log.d(TAG, "draw()");*/
- if (this.app_is_paused) {
+ if( this.is_paused ) {
/*if( MyDebug.LOG )
- Log.d(TAG, "draw(): app is paused");*/
+ Log.d(TAG, "draw(): paused");*/
return;
}
/*if( true ) // test
@@ -3832,17 +4069,33 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "getScaledZoomFactor() " + scale_factor);
int new_zoom_factor = 0;
- if (this.camera_controller != null && this.has_zoom) {
- int zoom_factor = camera_controller.getZoom();
- float zoom_ratio = this.zoom_ratios.get(zoom_factor) / 100.0f;
+ if( this.camera_controller != null && this.has_zoom ) {
+ final int zoom_factor = camera_controller.getZoom();
+ float zoom_ratio;
+ if( has_smooth_zoom ) {
+ zoom_ratio = smooth_zoom;
+ if( MyDebug.LOG )
+ Log.d(TAG, " use smooth_zoom: " + smooth_zoom + " instead of: " + this.zoom_ratios.get(zoom_factor)/100.0f);
+ }
+ else {
+ zoom_ratio = this.zoom_ratios.get(zoom_factor)/100.0f;
+ }
zoom_ratio *= scale_factor;
+ if( MyDebug.LOG )
+ Log.d(TAG, " zoom_ratio: " + zoom_ratio);
new_zoom_factor = zoom_factor;
- if (zoom_ratio <= 1.0f) {
+ if( zoom_ratio <= zoom_ratios.get(0)/100.0f ) {
new_zoom_factor = 0;
- } else if (zoom_ratio >= zoom_ratios.get(max_zoom_factor) / 100.0f) {
+ if( has_smooth_zoom )
+ smooth_zoom = zoom_ratios.get(0)/100.0f;
+ }
+ else if( zoom_ratio >= zoom_ratios.get(max_zoom_factor)/100.0f ) {
new_zoom_factor = max_zoom_factor;
- } else {
+ if( has_smooth_zoom )
+ smooth_zoom = zoom_ratios.get(max_zoom_factor)/100.0f;
+ }
+ else {
// find the closest zoom level
if (scale_factor > 1.0f) {
// zooming in
@@ -3865,6 +4118,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
}
}
+ if( has_smooth_zoom )
+ smooth_zoom = zoom_ratio;
}
if (MyDebug.LOG) {
Log.d(TAG, "zoom_ratio is now " + zoom_ratio);
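The smooth-zoom hunk above clamps the scaled ratio to the range covered by the `zoom_ratios` array (stored as percentages), pinning the index to 0 or the maximum when the ratio falls outside it. A small runnable sketch of just the clamping step, with simplified names standing in for the Preview fields:

```java
// Hedged sketch of the smooth-zoom clamping above: a pinch scale factor is
// applied to the current zoom ratio, and the result is clamped to the range
// covered by the supported zoom ratios (given in percent, as in zoom_ratios).
public class ZoomClamp {
    /** Returns the clamped zoom ratio for a given pinch scale factor. */
    static float clampZoomRatio(int[] zoomRatiosPercent, float currentRatio, float scaleFactor) {
        float min = zoomRatiosPercent[0] / 100.0f;
        float max = zoomRatiosPercent[zoomRatiosPercent.length - 1] / 100.0f;
        float ratio = currentRatio * scaleFactor;
        return Math.max(min, Math.min(max, ratio));
    }

    public static void main(String[] args) {
        int[] ratios = {100, 150, 200, 400}; // 1x..4x, in percent
        System.out.println(clampZoomRatio(ratios, 2.0f, 3.0f)); // clamps to 4.0
        System.out.println(clampZoomRatio(ratios, 1.0f, 0.5f)); // clamps to 1.0
    }
}
```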
@@ -3886,6 +4141,9 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
}
+ /** Zooms to the supplied index (within the zoom_ratios array).
+ * @param new_zoom_factor The index to zoom to.
+ */
public void zoomTo(int new_zoom_factor) {
if (MyDebug.LOG)
Log.d(TAG, "ZoomTo(): " + new_zoom_factor);
@@ -4054,7 +4312,9 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
public String getExposureTimeString(long exposure_time) {
- double exposure_time_s = exposure_time / 1000000000.0;
+ /*if( MyDebug.LOG )
+ Log.d(TAG, "getExposureTimeString(): " + exposure_time);*/
+ double exposure_time_s = exposure_time/1000000000.0;
String string;
if (exposure_time > 100000000) {
// show exposure times of more than 0.1s directly
@@ -4063,6 +4323,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
double exposure_time_r = 1.0 / exposure_time_s;
string = " 1/" + (int) (exposure_time_r + 0.5) + getResources().getString(R.string.seconds_abbreviation);
}
+ /*if( MyDebug.LOG )
+ Log.d(TAG, "getExposureTimeString() return: " + string);*/
return string;
}
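`getExposureTimeString()` takes a nanosecond exposure time and shows anything over 0.1s directly in seconds, shorter times as a rounded reciprocal fraction. A standalone sketch of that logic; the decimal formatting of the long-exposure branch is an assumption (that branch is not shown in the hunk), and the plain `"s"` suffix stands in for `R.string.seconds_abbreviation`:

```java
import java.util.Locale;

// Hedged sketch of getExposureTimeString(): times over 0.1s (1e8 ns) are
// shown directly in seconds (exact decimal format assumed here), shorter
// times as "1/n s" with the denominator rounded to the nearest integer,
// matching the else-branch visible in the diff above.
public class ExposureTimeFormat {
    static String format(long exposureTimeNs) {
        double seconds = exposureTimeNs / 1000000000.0;
        if( exposureTimeNs > 100000000 ) {
            return String.format(Locale.US, "%.2fs", seconds);
        }
        double reciprocal = 1.0 / seconds;
        return "1/" + (int)(reciprocal + 0.5) + "s"; // +0.5 rounds to nearest denominator
    }

    public static void main(String[] args) {
        System.out.println(format(33333333L));  // 1/30s
        System.out.println(format(500000000L)); // 0.50s
    }
}
```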
@@ -4249,8 +4511,13 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "fps_ranges not available");
return;
}
- int[] selected_fps = null;
- if (this.is_video) {
+ int [] selected_fps = null;
+ if( camera_controller.isCameraExtension() ) {
+ // don't set preview fps if using camera extension
+ // (important not to return here however - still want to call
+ // camera_controller.clearPreviewFpsRange() to clear a previously set fps)
+ }
+ else if( this.is_video ) {
// For Nexus 5 and Nexus 6, we need to set the preview fps using matchPreviewFpsToVideo to avoid problem of dark preview in low light, as described above.
// When the video recording starts, the preview automatically adjusts, but still good to avoid too-dark preview before the user starts recording.
// However I'm wary of changing the behaviour for all devices at the moment, since some devices can be
@@ -4354,16 +4621,19 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
updateFlashForVideo();
}
- if (!during_startup) {
- String focus_value = current_focus_index != -1 ? supported_focus_values.get(current_focus_index) : null;
- if (MyDebug.LOG)
+ if( !during_startup ) {
+ if( MyDebug.LOG ) {
+ String focus_value = current_focus_index != -1 ? supported_focus_values.get(current_focus_index) : null;
Log.d(TAG, "focus_value is " + focus_value);
+ }
// Although in theory we only need to stop and start preview, which should be faster, reopening the camera allows that to
// run on the background thread, thus not freezing the UI
// Also workaround for bug on Nexus 6 at least where switching to video and back to photo mode causes continuous picture mode to stop -
// at the least, we need to reopen camera when: ( !is_video && focus_value != null && focus_value.equals("focus_mode_continuous_picture") ).
// Lastly, note that it's important to still call setupCamera() when switching between photo and video modes (see comment for setupCamera()).
// So if we ever allow stopping/starting the preview again, we still need to call setupCamera() again.
+ // Update: and even if we want to go back to just stopping/starting the preview, it's likely still a good idea to reopen the camera when
+ // switching from/to vendor camera extensions, otherwise risk of hangs/crashes on at least some devices (see note in MainActivity.updateForSettings)
this.reopenCamera();
}
@@ -4413,7 +4683,11 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "found no existing focus_value");
// here we set the default values for focus mode
// note if updating default focus value for photo mode, also update MainActivityTest.setToDefault()
- updateFocus(is_video ? "focus_mode_continuous_video" : "focus_mode_continuous_picture", true, true, auto_focus);
+ if( !updateFocus(is_video ? "focus_mode_continuous_video" : "focus_mode_continuous_picture", true, true, auto_focus) ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "continuous focus not supported, so fall back to first");
+ updateFocus(0, true, true, auto_focus);
+ }
}
}
@@ -4951,8 +5225,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "takePicturePressed exit");
}
- private void takePictureOnTimer(final long timer_delay, @SuppressWarnings("unused") boolean repeated) {
- if (MyDebug.LOG) {
+ private void takePictureOnTimer(final long timer_delay, boolean repeated) {
+ if( MyDebug.LOG ) {
Log.d(TAG, "takePictureOnTimer");
Log.d(TAG, "timer_delay: " + timer_delay);
}
@@ -5079,7 +5353,7 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
try {
//if( true )
// throw new IOException(); // test
- if (info.video_method == ApplicationInterface.VIDEOMETHOD_FILE) {
+ if( info.video_method == ApplicationInterface.VideoMethod.FILE ) {
video_recorder.setNextOutputFile(new File(info.video_filename));
} else {
video_recorder.setNextOutputFile(info.video_pfd_saf.getFileDescriptor());
@@ -5091,6 +5365,7 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
} catch (IOException e) {
Log.e(TAG, "failed to setNextOutputFile");
e.printStackTrace();
+ info.close();
}
}
}
@@ -5102,7 +5377,10 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "seamless restart with setNextOutputFile has now occurred");
if (nextVideoFileInfo == null) {
Log.e(TAG, "received MEDIA_RECORDER_INFO_NEXT_OUTPUT_FILE_STARTED but nextVideoFileInfo is null");
- } else {
+ }
+ else {
+ videoFileInfo.close();
+ video_time_last_maxfilesize_restart = getVideoTime(false);
applicationInterface.restartedVideo(videoFileInfo.video_method, videoFileInfo.video_uri, videoFileInfo.video_filename);
videoFileInfo = nextVideoFileInfo;
nextVideoFileInfo = null;
@@ -5232,14 +5510,15 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
private VideoFileInfo createVideoFile(String extension) {
if (MyDebug.LOG)
Log.d(TAG, "createVideoFile");
+ VideoFileInfo video_file_info = null;
+ ParcelFileDescriptor video_pfd_saf = null;
try {
- int method = applicationInterface.createOutputVideoMethod();
+ ApplicationInterface.VideoMethod method = applicationInterface.createOutputVideoMethod();
Uri video_uri = null;
String video_filename = null;
- ParcelFileDescriptor video_pfd_saf = null;
- if (MyDebug.LOG)
+ if( MyDebug.LOG )
Log.d(TAG, "method? " + method);
- if (method == ApplicationInterface.VIDEOMETHOD_FILE) {
+ if( method == ApplicationInterface.VideoMethod.FILE ) {
/*if( true )
throw new IOException(); // test*/
File videoFile = applicationInterface.createOutputVideoFile(extension);
@@ -5248,9 +5527,13 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "save to: " + video_filename);
} else {
Uri uri;
- if (method == ApplicationInterface.VIDEOMETHOD_SAF) {
+ if( method == ApplicationInterface.VideoMethod.SAF ) {
uri = applicationInterface.createOutputVideoSAF(extension);
- } else {
+ }
+ else if( method == ApplicationInterface.VideoMethod.MEDIASTORE ) {
+ uri = applicationInterface.createOutputVideoMediaStore(extension);
+ }
+ else {
uri = applicationInterface.createOutputVideoUri();
}
if (MyDebug.LOG)
@@ -5259,13 +5542,26 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
video_uri = uri;
}
- return new VideoFileInfo(method, video_uri, video_filename, video_pfd_saf);
- } catch (IOException e) {
- if (MyDebug.LOG)
+ video_file_info = new VideoFileInfo(method, video_uri, video_filename, video_pfd_saf);
+ }
+ catch(IOException e) {
+ if( MyDebug.LOG )
Log.e(TAG, "Couldn't create media video file; check storage permissions?");
e.printStackTrace();
}
- return null;
+ finally {
+ if( video_file_info == null && video_pfd_saf != null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "failed, so clean up video_pfd_saf");
+ try {
+ video_pfd_saf.close();
+ }
+ catch(IOException e) {
+ e.printStackTrace();
+ }
+ }
+ }
+ return video_file_info;
}
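The `createVideoFile()` change above introduces a close-on-failure pattern: the file descriptor is opened before the result object is constructed, and a `finally` block closes it only when construction failed (`video_file_info` still null), so the descriptor is neither leaked on failure nor closed prematurely on success. A minimal sketch of that pattern, with `FakeDescriptor` standing in for `ParcelFileDescriptor`:

```java
import java.io.Closeable;
import java.io.IOException;

// Hedged sketch of the cleanup pattern in createVideoFile() above: on
// failure the half-opened resource is closed in finally; on success it is
// handed to the caller untouched. FakeDescriptor is an illustrative stand-in.
public class CloseOnFailureDemo {
    static class FakeDescriptor implements Closeable {
        boolean closed = false;
        @Override public void close() { closed = true; }
    }

    /** Returns the open descriptor on success, or null after closing it on failure. */
    static FakeDescriptor open(boolean fail) {
        FakeDescriptor result = null;
        FakeDescriptor pfd = null;
        try {
            pfd = new FakeDescriptor();
            if( fail )
                throw new IOException("simulated failure");
            result = pfd;
        }
        catch(IOException e) {
            // logged and swallowed, mirroring the IOException handling above
        }
        finally {
            if( result == null && pfd != null )
                pfd.close(); // failed: clean up rather than leak the descriptor
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(open(false) != null); // true: caller owns the open descriptor
        System.out.println(open(true) == null);  // true: descriptor closed, nothing leaked
    }
}
```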
/**
@@ -5392,7 +5688,7 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "actual video_max_duration: " + video_max_duration);
local_video_recorder.setMaxDuration((int) video_max_duration);
- if (videoFileInfo.video_method == ApplicationInterface.VIDEOMETHOD_FILE) {
+ if( videoFileInfo.video_method == ApplicationInterface.VideoMethod.FILE ) {
local_video_recorder.setOutputFile(videoFileInfo.video_filename);
} else {
local_video_recorder.setOutputFile(videoFileInfo.video_pfd_saf.getFileDescriptor());
@@ -5407,14 +5703,33 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
local_video_recorder.setOrientationHint(getImageVideoRotation());
if (MyDebug.LOG)
Log.d(TAG, "about to prepare video recorder");
+
local_video_recorder.prepare();
+ if( test_video_ioexception ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "test_video_ioexception is true");
+ throw new IOException();
+ }
+
boolean want_photo_video_recording = supportsPhotoVideoRecording() && applicationInterface.usePhotoVideoRecording();
+
camera_controller.initVideoRecorderPostPrepare(local_video_recorder, want_photo_video_recording);
- if (MyDebug.LOG)
+ if( test_video_cameracontrollerexception ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "test_video_cameracontrollerexception is true");
+ throw new CameraControllerException();
+ }
+
+ if( MyDebug.LOG )
Log.d(TAG, "about to start video recorder");
try {
local_video_recorder.start();
+ if( test_video_failure ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "test_video_failure is true");
+ throw new RuntimeException();
+ }
this.video_recorder = local_video_recorder;
videoRecordingStarted(max_filesize_restart);
} catch (RuntimeException e) {
@@ -5478,6 +5793,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
video_recorder.release();
video_recorder = null;
video_recorder_is_paused = false;
+ applicationInterface.deleteUnusedVideo(videoFileInfo.video_method, videoFileInfo.video_uri, videoFileInfo.video_filename);
+ videoFileInfo = new VideoFileInfo();
applicationInterface.cameraInOperation(false, true);
this.reconnectCamera(true);
} catch (CameraControllerException e) {
@@ -5501,6 +5818,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
video_recorder.release();
video_recorder = null;
video_recorder_is_paused = false;
+ applicationInterface.deleteUnusedVideo(videoFileInfo.video_method, videoFileInfo.video_uri, videoFileInfo.video_filename);
+ videoFileInfo = new VideoFileInfo();
applicationInterface.cameraInOperation(false, true);
this.reconnectCamera(true);
this.showToast(null, R.string.video_no_free_space);
@@ -5521,13 +5840,9 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
faces_detected = null;
}
- if (test_video_failure) {
- if (MyDebug.LOG)
- Log.d(TAG, "test_video_failure is true");
- throw new RuntimeException();
- }
video_start_time = System.currentTimeMillis();
video_start_time_set = true;
+ video_time_last_maxfilesize_restart = max_filesize_restart ? video_accumulated_time : 0;
applicationInterface.startedVideo();
// Don't send intent for ACTION_MEDIA_SCANNER_SCAN_FILE yet - wait until finished, so we get completed file.
// Don't do any further calls after applicationInterface.startedVideo() that might throw an error - instead video error
@@ -5621,6 +5936,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
video_recorder.release();
video_recorder = null;
video_recorder_is_paused = false;
+ applicationInterface.deleteUnusedVideo(videoFileInfo.video_method, videoFileInfo.video_uri, videoFileInfo.video_filename);
+ videoFileInfo = new VideoFileInfo();
applicationInterface.cameraInOperation(false, true);
this.reconnectCamera(true);
}
@@ -6014,7 +6331,12 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
if (MyDebug.LOG)
Log.d(TAG, "enable_sound? " + enable_sound);
camera_controller.enableShutterSound(enable_sound);
- if (using_android_l) {
+ if( using_android_l ) {
+ boolean camera2_dummy_capture_hack = applicationInterface.useCamera2DummyCaptureHack();
+ if( MyDebug.LOG )
+ Log.d(TAG, "camera2_dummy_capture_hack? " + camera2_dummy_capture_hack);
+ camera_controller.setDummyCaptureHack( camera2_dummy_capture_hack );
+
boolean use_camera2_fast_burst = applicationInterface.useCamera2FastBurst();
if (MyDebug.LOG)
Log.d(TAG, "use_camera2_fast_burst? " + use_camera2_fast_burst);
@@ -6195,6 +6517,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
String current_ui_focus_value = getCurrentFocusValue();
if (current_ui_focus_value != null && !camera_controller.getFocusValue().equals(current_ui_focus_value) && camera_controller.getFocusValue().equals("focus_mode_auto")) {
camera_controller.cancelAutoFocus();
+ if( MyDebug.LOG )
+ Log.d(TAG, "switch back to: " + current_ui_focus_value);
camera_controller.setFocusValue(current_ui_focus_value);
} else {
if (MyDebug.LOG)
@@ -6823,6 +7147,18 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
return this.supports_burst;
}
+ /** Returns whether the supplied camera vendor extension is supported (see
+ * https://developer.android.com/reference/android/hardware/camera2/CameraExtensionCharacteristics ).
+ */
+ public boolean supportsCameraExtension(int extension) {
+ if( extension == CameraExtensionCharacteristics.EXTENSION_HDR ) {
+ // blocked for now, as have yet to be able to test this (seems to have no effect on Galaxy S10e;
+ // not available on Pixel 6 Pro)
+ return false;
+ }
+ return this.supported_extensions != null && this.supported_extensions.contains(extension);
+ }
+
public boolean supportsRaw() {
return this.supports_raw;
}
@@ -6920,24 +7256,27 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
/**
- * @param check_supported If true, and a burst mode is in use (fast burst, expo, HDR), and/or
- * a constraint was set via getCameraResolutionPref(), then the returned
- * list will be filtered to remove sizes that don't support burst and/or
- * these constraints.
+ * @param check_supported If true, and a burst mode is in use (fast burst, expo, HDR), or
+ * a camera vendor extension mode, and/or a constraint was set via
+ * getCameraResolutionPref(), then the returned list will be filtered to
+ * remove sizes that don't support burst and/or these constraints.
*/
public List<CameraController.Size> getSupportedPictureSizes(boolean check_supported) {
if (MyDebug.LOG)
Log.d(TAG, "getSupportedPictureSizes");
- boolean is_burst = (camera_controller != null && camera_controller.isBurstOrExpo());
+ boolean is_burst = ( camera_controller != null && camera_controller.isBurstOrExpo() );
+ boolean is_extension = ( camera_controller != null && camera_controller.isCameraExtension() );
+ int extension = is_extension ? camera_controller.getCameraExtension() : -1;
boolean has_constraints = photo_size_constraints != null && photo_size_constraints.hasConstraints();
- if (check_supported && (is_burst || has_constraints)) {
- if (MyDebug.LOG)
- Log.d(TAG, "need to filter picture sizes for burst mode and/or constraints");
+ if( check_supported && ( is_burst || is_extension || has_constraints ) ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "need to filter picture sizes for burst mode and/or extension mode and/or constraints");
List<CameraController.Size> filtered_sizes = new ArrayList<>();
- for (CameraController.Size size : photo_sizes) {
- if (is_burst && !size.supports_burst) {
- // burst mode not supported
- } else if (!photo_size_constraints.satisfies(size)) {
+ for(CameraController.Size size : photo_sizes) {
+ if( !size.supportsRequirements(is_burst, is_extension, extension) ) {
+ // burst or extension mode not supported
+ }
+ else if( !photo_size_constraints.satisfies(size) ) {
// doesn't satisfy imposed constraints
} else {
filtered_sizes.add(size);
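The filtering above narrows the picture-size list by a conjunction of independent requirements: burst support, vendor-extension support, and any user-imposed constraints. A runnable sketch of that conjunction, with a simplified `Size` stand-in (the real check lives in `CameraController.Size.supportsRequirements()`):

```java
import java.util.*;
import java.util.function.Predicate;

// Hedged sketch of the picture-size filtering above: a size survives only
// if it satisfies burst mode (when active), the active vendor extension
// (when active), and the imposed resolution constraints. The Size type is a
// simplified stand-in for CameraController.Size.
public class PictureSizeFilter {
    record Size(int width, int height, boolean supportsBurst, Set<Integer> extensions) {}

    static List<Size> filter(List<Size> sizes, boolean isBurst, boolean isExtension,
                             int extension, Predicate<Size> constraints) {
        List<Size> filtered = new ArrayList<>();
        for( Size size : sizes ) {
            if( isBurst && !size.supportsBurst() )
                continue; // burst mode not supported at this resolution
            if( isExtension && !size.extensions().contains(extension) )
                continue; // vendor extension not supported at this resolution
            if( !constraints.test(size) )
                continue; // doesn't satisfy imposed constraints
            filtered.add(size);
        }
        return filtered;
    }

    public static void main(String[] args) {
        List<Size> sizes = List.of(
            new Size(4000, 3000, false, Set.of()),
            new Size(1920, 1080, true, Set.of(2)));
        // burst required, extension 2 active, no extra constraints:
        List<Size> result = filter(sizes, true, true, 2, s -> true);
        System.out.println(result.size()); // 1
    }
}
```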
@@ -7073,11 +7412,14 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
return camera_controller.getAPI();
}
+ /** Call when activity is resumed.
+ */
public void onResume() {
if (MyDebug.LOG)
Log.d(TAG, "onResume");
recreatePreviewBitmap();
this.app_is_paused = false;
+ this.is_paused = false;
cameraSurface.onResume();
if (canvasView != null)
canvasView.onResume();
@@ -7097,12 +7439,25 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
}
+ /** Call when activity is paused.
+ */
public void onPause() {
- if (MyDebug.LOG)
+ onPause(true);
+ }
+
+ /** Call when activity is paused, or the application wants to put the Preview into a paused
+ * state (closing the camera etc).
+ * @param activity_is_pausing Set to true if this is called because the activity is being paused;
+ * set to false if the Preview is being paused without the activity itself pausing.
+ */
+ public void onPause(boolean activity_is_pausing) {
+ if( MyDebug.LOG )
Log.d(TAG, "onPause");
- this.app_is_paused = true;
- if (camera_open_state == CameraOpenState.CAMERAOPENSTATE_OPENING) {
- if (MyDebug.LOG)
+ this.is_paused = true;
+ if( activity_is_pausing )
+ this.app_is_paused = true; // note, if activity_is_pausing==false, we don't change app_is_paused, in case the app was already paused via a separate call to onPause
+ if( camera_open_state == CameraOpenState.CAMERAOPENSTATE_OPENING ) {
+ if( MyDebug.LOG )
Log.d(TAG, "cancel open_camera_task");
if (open_camera_task != null) { // just to be safe
this.open_camera_task.cancel(true);
@@ -7178,115 +7533,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "onSaveInstanceState");
}
- private class RotatedTextView extends View {
- private String[] lines;
- private int offset_y;
- private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
- private final Rect bounds = new Rect();
- private final Rect sub_bounds = new Rect();
- private final RectF rect = new RectF();
- private final boolean style_outline; // if true, display text with outline rather than background
-
- RotatedTextView(String text, int offset_y, boolean style_outline, Context context) {
- super(context);
-
- this.lines = text.split("\n");
- this.offset_y = offset_y;
- this.style_outline = style_outline;
-
- if (style_outline) {
- // outline style looks clearer when using bold text
- this.paint.setTypeface(Typeface.create(Typeface.DEFAULT, Typeface.BOLD));
- }
- }
-
- void setText(String text) {
- this.lines = text.split("\n");
- }
-
- void setOffsetY(int offset_y) {
- this.offset_y = offset_y;
- }
-
- @SuppressLint("CanvasSize")
- @Override
- protected void onDraw(Canvas canvas) {
- final float scale = Preview.this.getResources().getDisplayMetrics().density;
- paint.setTextSize(14 * scale + 0.5f); // convert dps to pixels
- if (!style_outline) {
- paint.setShadowLayer(1, 0, 1, Color.BLACK);
- }
- //paint.getTextBounds(text, 0, text.length(), bounds);
- boolean first_line = true;
- for (String line : lines) {
- paint.getTextBounds(line, 0, line.length(), sub_bounds);
- /*if( MyDebug.LOG ) {
- Log.d(TAG, "line: " + line + " sub_bounds: " + sub_bounds);
- }*/
- if (first_line) {
- bounds.set(sub_bounds);
- first_line = false;
- } else {
- bounds.top = Math.min(sub_bounds.top, bounds.top);
- bounds.bottom = Math.max(sub_bounds.bottom, bounds.bottom);
- bounds.left = Math.min(sub_bounds.left, bounds.left);
- bounds.right = Math.max(sub_bounds.right, bounds.right);
- }
- }
- // above we've worked out the maximum bounds of each line - this is useful for left/right, but for the top/bottom
- // we would rather use a consistent height no matter what the text is (otherwise we have the problem of varying
- // gap between lines, depending on what the characters are).
- final String reference_text = "Ap";
- paint.getTextBounds(reference_text, 0, reference_text.length(), sub_bounds);
- bounds.top = sub_bounds.top;
- bounds.bottom = sub_bounds.bottom;
- /*if( MyDebug.LOG ) {
- Log.d(TAG, "bounds: " + bounds);
- }*/
- int height = bounds.bottom - bounds.top; // height of each line
- bounds.bottom += ((lines.length - 1) * height) / 2;
- bounds.top -= ((lines.length - 1) * height) / 2;
- final int padding = (int) (14 * scale + 0.5f); // padding for the shaded rectangle; convert dps to pixels
- canvas.save();
- canvas.rotate(ui_rotation, canvas.getWidth() / 2.0f, canvas.getHeight() / 2.0f);
- float margin = 32.0f;
-
- rect.left = canvas.getWidth() / 2.0f - bounds.width() / 2.0f + bounds.left - padding + margin;
- rect.top = canvas.getHeight() / 2.0f + bounds.top - padding + offset_y;
- rect.right = canvas.getWidth() / 2.0f - bounds.width() / 2.0f + bounds.right + padding + margin;
- rect.bottom = canvas.getHeight() / 2.0f + bounds.bottom + padding + offset_y;
-
- paint.setStyle(Paint.Style.FILL);
- if (!style_outline) {
- paint.setColor(Preview.this.getResources().getColor(R.color.color_toast_bg));
- final float radius = (24 * scale + 0.5f); // convert dps to pixels
- canvas.drawRoundRect(rect, radius, radius, paint);
- }
-
- paint.setColor(Color.WHITE);
- int ypos = canvas.getHeight() / 2 + offset_y - ((lines.length - 1) * height) / 2;
- for (String line : lines) {
- canvas.drawText(line, canvas.getWidth() / 2.0f - bounds.width() / 2.0f + margin, ypos, paint);
-
- if (style_outline) {
- // draw outline
- int current_color = paint.getColor();
- paint.setColor(Color.BLACK);
- paint.setStyle(Paint.Style.STROKE);
- paint.setStrokeWidth(1);
- canvas.drawText(line, canvas.getWidth() / 2.0f - bounds.width() / 2.0f + margin, ypos, paint);
- paint.setStyle(Paint.Style.FILL);
- paint.setColor(current_color);
- }
-
- ypos += height;
- }
- canvas.restore();
- }
- }
-
private final Handler fake_toast_handler = new Handler();
- private RotatedTextView active_fake_toast = null;
+ private TextView active_fake_toast = null;
public void clearActiveFakeToast() {
clearActiveFakeToast(false);
@@ -7391,6 +7639,10 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
final float scale = Preview.this.getResources().getDisplayMetrics().density;
final int offset_y = (int) (offset_y_dp * scale + 0.5f); // convert dps to pixels
+ float shadow_radius = (2.0f * scale + 0.5f); // convert dps to pixels
+ shadow_radius = Math.max(shadow_radius, 1.0f);
+ if( MyDebug.LOG )
+ Log.d(TAG, "shadow_radius: " + shadow_radius);
if (use_fake_toast) {
if (active_fake_toast != null) {
@@ -7398,13 +7650,19 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
if (MyDebug.LOG)
Log.d(TAG, "re-use fake toast: " + active_fake_toast);
active_fake_toast.setText(message);
- active_fake_toast.setOffsetY(offset_y);
+ active_fake_toast.setPadding(0, offset_y, 0, 0);
active_fake_toast.invalidate(); // make sure the view is redrawn
- } else {
- active_fake_toast = new RotatedTextView(message, offset_y, true, activity);
- if (MyDebug.LOG)
- Log.d(TAG, "create new fake toast: " + active_fake_toast);
+ }
+ else {
Activity activity = (Activity) Preview.this.getContext();
+ @SuppressLint("InflateParams") // we add the view a few lines below
+ final View view = LayoutInflater.from(activity).inflate(R.layout.toast_textview, null);
+ active_fake_toast = view.findViewById(R.id.text_view);
+ active_fake_toast.setShadowLayer(shadow_radius, 0.0f, 0.0f, Color.BLACK);
+ active_fake_toast.setPadding(0, offset_y, 0, 0);
+ active_fake_toast.setText(message);
+ if( MyDebug.LOG )
+ Log.d(TAG, "create new fake toast: " + active_fake_toast);
final FrameLayout rootLayout = activity.findViewById(android.R.id.content);
rootLayout.addView(active_fake_toast);
}
@@ -7449,9 +7707,9 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "reuse last toast: " + last_toast);
toast = clear_toast.toast;
// for performance, important to reuse the same view, instead of creating a new one (otherwise we get jerky preview update e.g. for changing manual focus slider)
- RotatedTextView view = (RotatedTextView) toast.getView();
+ TextView view = (TextView)toast.getView();
view.setText(message);
- view.setOffsetY(offset_y);
+ view.setPadding(0, offset_y, 0, 0);
view.invalidate(); // make sure the toast is redrawn
toast.setView(view);
} else {
@@ -7465,8 +7723,14 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
Log.d(TAG, "created new toast: " + toast);
if (clear_toast != null)
clear_toast.toast = toast;
- View text = new RotatedTextView(message, offset_y, false, activity);
+ @SuppressLint("InflateParams") // we add the view to the toast
+ final View view = LayoutInflater.from(activity).inflate(R.layout.toast_textview, null);
+ TextView text = view.findViewById(R.id.text_view);
+ text.setShadowLayer(shadow_radius, 0.0f, 0.0f, Color.BLACK);
+ text.setText(message);
+ text.setPadding(0, offset_y, 0, 0); // padding must go on the TextView, since that is what is passed to setView()
toast.setView(text);
+ toast.setGravity(Gravity.CENTER, 0, 0);
last_toast_time_ms = time_now;
}
toast.setDuration(Toast.LENGTH_SHORT);
@@ -7483,8 +7747,8 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
public void setUIRotation(int ui_rotation) {
- if (MyDebug.LOG)
- Log.d(TAG, "setUIRotation");
+ if( MyDebug.LOG )
+ Log.d(TAG, "setUIRotation: " + ui_rotation);
this.ui_rotation = ui_rotation;
}
@@ -7591,8 +7855,11 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
private void recreatePreviewBitmap() {
- if (MyDebug.LOG)
+ if( MyDebug.LOG ) {
Log.d(TAG, "recreatePreviewBitmap");
+ Log.d(TAG, "textureview_w: " + textureview_w);
+ Log.d(TAG, "textureview_h: " + textureview_h);
+ }
freePreviewBitmap();
if (want_preview_bitmap) {
@@ -7966,7 +8233,7 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
// of the device
int rotation_degrees = preview.getDisplayRotationDegrees();
/*if( MyDebug.LOG ) {
- Log.d(TAG, "orientation of display relative to natural orientaton: " + rotation_degrees);
+ Log.d(TAG, "orientation of display relative to natural orientation: " + rotation_degrees);
}*/
if (MyDebug.LOG)
Log.d(TAG, "time before creating new_zebra_stripes_bitmap: " + (System.currentTimeMillis() - debug_time));
@@ -8111,10 +8378,10 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
final int refresh_histogram_rate_ms = 200;
final long refresh_time = (want_zebra_stripes || want_focus_peaking) ? 40 : refresh_histogram_rate_ms;
long time_now = System.currentTimeMillis();
- if (want_preview_bitmap && preview_bitmap != null && Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP &&
- !app_is_paused && !applicationInterface.isPreviewInBackground() &&
- !refreshPreviewBitmapTaskIsRunning() && time_now > last_preview_bitmap_time_ms + refresh_time) {
- if (MyDebug.LOG)
+ if( want_preview_bitmap && preview_bitmap != null && Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP &&
+ !is_paused && !applicationInterface.isPreviewInBackground() &&
+ !refreshPreviewBitmapTaskIsRunning() && time_now > last_preview_bitmap_time_ms + refresh_time ) {
+ if( MyDebug.LOG )
Log.d(TAG, "refreshPreviewBitmap");
// even if we're running the background task at a faster rate (due to zebra stripes etc), we still update the histogram
// at the standard rate
@@ -8158,12 +8425,17 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
return isVideoRecording() && video_recorder_is_paused;
}
- public long getVideoTime() {
- if (this.isVideoRecordingPaused()) {
- return video_accumulated_time;
+ /** Returns the time of the current video.
+ * In case of restarting due to max filesize (whether on Android 8+ or not), this includes the
+ * total time of all the previous video files too, unless this_file_only==true.
+ */
+ public long getVideoTime(boolean this_file_only) {
+ long offset = this_file_only ? video_time_last_maxfilesize_restart : 0;
+ if( this.isVideoRecordingPaused() ) {
+ return video_accumulated_time - offset;
}
long time_now = System.currentTimeMillis();
- return time_now - video_start_time + video_accumulated_time;
+ return time_now - video_start_time + video_accumulated_time - offset;
}
public long getVideoAccumulatedTime() {
@@ -8251,7 +8523,12 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
}
public Pair<Integer, Integer> getFocusPos() {
- return new Pair<>(focus_screen_x, focus_screen_y);
+ // note, we don't store the screen coordinates, as they may become out of date if the
+ // screen orientation changes (if MainActivity.lock_to_landscape==false)
+ float [] coords = {focus_camera_x, focus_camera_y};
+ final Matrix matrix = getCameraToPreviewMatrix();
+ matrix.mapPoints(coords);
+ return new Pair<>((int)coords[0], (int)coords[1]);
}
public int getMaxNumFocusAreas() {
@@ -8303,7 +8580,20 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
return this.successfully_focused && System.currentTimeMillis() < this.successfully_focused_time + 5000;
}
- public CameraController.Face[] getFacesDetected() {
+ /** If non-null, the returned array will store the currently detected faces (if face detection
+ * is enabled). The face.temp rect will store the face rectangle in screen coordinates.
+ */
+ public CameraController.Face [] getFacesDetected() {
+ if( faces_detected != null && faces_detected.length > 0 ) {
+ // note, we don't store the screen coordinates, as they may become out of date if the
+ // screen orientation changes (if MainActivity.lock_to_landscape==false)
+ final Matrix matrix = getCameraToPreviewMatrix();
+ for(CameraController.Face face : faces_detected) {
+ face_rect.set(face.rect);
+ matrix.mapRect(face_rect);
+ face_rect.round(face.temp);
+ }
+ }
// FindBugs warns about returning the array directly, but in fact we need to return direct access rather than copying, so that the on-screen display of faces rectangles updates
return this.faces_detected;
}
@@ -8317,4 +8607,22 @@ public class Preview implements SurfaceHolder.Callback, TextureView.SurfaceTextu
int zoom_factor = camera_controller.getZoom();
return this.zoom_ratios.get(zoom_factor) / 100.0f;
}
+
+ public float getZoomRatio(int index) {
+ if( zoom_ratios == null )
+ return 1.0f;
+ return this.zoom_ratios.get(index)/100.0f;
+ }
+
+ public float getMinZoomRatio() {
+ if( zoom_ratios == null )
+ return 1.0f;
+ return this.zoom_ratios.get(0)/100.0f;
+ }
+
+ public float getMaxZoomRatio() {
+ if( zoom_ratios == null )
+ return 1.0f;
+ return this.zoom_ratios.get(max_zoom_factor)/100.0f;
+ }
}
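Editorial note on the `getVideoTime(boolean)` change above: the arithmetic can be sketched as a standalone function, with plain parameters standing in for the `Preview` fields `video_start_time`, `video_accumulated_time` and `video_time_last_maxfilesize_restart` (the accumulated total at the point the current file started). This is a sketch of the bookkeeping only, not code from the patch:

```java
public class VideoTimeSketch {
    // Total recorded video time; with this_file_only, subtract the total recorded
    // at the moment the current file began (the max-filesize restart offset).
    static long getVideoTime(boolean this_file_only, boolean paused, long now_ms,
                             long video_start_time, long video_accumulated_time,
                             long video_time_last_maxfilesize_restart) {
        long offset = this_file_only ? video_time_last_maxfilesize_restart : 0;
        if( paused )
            return video_accumulated_time - offset;
        return now_ms - video_start_time + video_accumulated_time - offset;
    }

    public static void main(String[] args) {
        // current segment started 1000ms ago, 5000ms accumulated before it,
        // and the current file began when the total stood at 3000ms
        System.out.println(getVideoTime(false, false, 10000, 9000, 5000, 3000)); // 6000
        System.out.println(getVideoTime(true, false, 10000, 9000, 5000, 3000)); // 3000
    }
}
```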
diff --git a/app/src/main/java/net/sourceforge/opencamera/preview/VideoProfile.java b/app/src/main/java/net/sourceforge/opencamera/preview/VideoProfile.java
index 8ebe7c1cb5b84d7fd01ee4e8735bdbd820de6ea9..8de628050e4e6fb40eb016ef1763e7c17a33046c 100644
--- a/app/src/main/java/net/sourceforge/opencamera/preview/VideoProfile.java
+++ b/app/src/main/java/net/sourceforge/opencamera/preview/VideoProfile.java
@@ -4,6 +4,8 @@ import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.util.Log;
+import androidx.annotation.NonNull;
+
import net.sourceforge.opencamera.MyDebug;
/** This is essentially similar to CamcorderProfile in that it encapsulates a set of video settings
@@ -54,6 +56,7 @@ public class VideoProfile {
this.videoFrameWidth = camcorderProfile.videoFrameWidth;
}
+ @NonNull
public String toString() {
return ("\nAudioSource: " + this.audioSource +
"\nVideoSource: " + this.videoSource +
diff --git a/app/src/main/java/net/sourceforge/opencamera/remotecontrol/BluetoothLeService.java b/app/src/main/java/net/sourceforge/opencamera/remotecontrol/BluetoothLeService.java
index 6cacf161bcbc778a8b2b62075b3c25a3da6ce76c..daaa6b2937e3e8301f89c7a5b9f18ba4ae58c5ff 100644
--- a/app/src/main/java/net/sourceforge/opencamera/remotecontrol/BluetoothLeService.java
+++ b/app/src/main/java/net/sourceforge/opencamera/remotecontrol/BluetoothLeService.java
@@ -2,6 +2,7 @@ package net.sourceforge.opencamera.remotecontrol;
import net.sourceforge.opencamera.MyDebug;
+import android.Manifest;
import android.app.Service;
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
@@ -14,11 +15,14 @@ import android.bluetooth.BluetoothManager;
import android.bluetooth.BluetoothProfile;
import android.content.Context;
import android.content.Intent;
+import android.content.pm.PackageManager;
import android.os.Binder;
import android.os.Build;
import android.os.Handler;
import android.os.IBinder;
import androidx.annotation.RequiresApi;
+import androidx.core.content.ContextCompat;
+
import android.util.Log;
import java.util.ArrayList;
@@ -33,6 +37,7 @@ import java.util.UUID;
public class BluetoothLeService extends Service {
private final static String TAG = "BluetoothLeService";
+ private boolean is_bound; // whether service is bound
private BluetoothManager bluetoothManager;
private BluetoothAdapter bluetoothAdapter;
private String device_address;
@@ -79,10 +84,37 @@ public class BluetoothLeService extends Service {
* Android BLE stack and API (just knowing the MAC is not enough on
* many phones).*/
private void triggerScan() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "triggerScan");
+
+ if( !is_bound ) {
+ // Don't allow calls to startLeScan() (which requires location permission) when service
+ // not bound, as application may be in background!
+ // In theory this shouldn't be needed here, as we also check is_bound in connect(), but
+ // have it here too just to be safe.
+ Log.e(TAG, "triggerScan shouldn't be called when service not bound");
+ return;
+ }
+
+ // Check for Android 12 Bluetooth permission just in case (and for Android lint error)
+ if( DeviceScanner.useAndroid12BluetoothPermissions() ) {
+ if( ContextCompat.checkSelfPermission(this, Manifest.permission.BLUETOOTH_SCAN) != PackageManager.PERMISSION_GRANTED ) {
+ Log.e(TAG, "bluetooth scan permission not granted!");
+ return;
+ }
+ }
+
// Stops scanning after a pre-defined scan period.
bluetoothHandler.postDelayed(new Runnable() {
@Override
public void run() {
+ // Check for Android 12 Bluetooth permission just in case (and for Android lint error)
+ if( DeviceScanner.useAndroid12BluetoothPermissions() ) {
+ if( ContextCompat.checkSelfPermission(BluetoothLeService.this, Manifest.permission.BLUETOOTH_SCAN) != PackageManager.PERMISSION_GRANTED ) {
+ Log.e(TAG, "bluetooth scan permission not granted!");
+ return;
+ }
+ }
bluetoothAdapter.stopLeScan(null);
}
}, 10000);
@@ -105,7 +137,20 @@ public class BluetoothLeService extends Service {
if( MyDebug.LOG ) {
Log.d(TAG, "Connected to GATT server, call discoverServices()");
}
- bluetoothGatt.discoverServices();
+
+ // Check for Android 12 Bluetooth permission just in case (and for Android lint error)
+ boolean has_bluetooth_permission = true;
+ if( DeviceScanner.useAndroid12BluetoothPermissions() ) {
+ if( ContextCompat.checkSelfPermission(BluetoothLeService.this, Manifest.permission.BLUETOOTH_CONNECT) != PackageManager.PERMISSION_GRANTED ) {
+ Log.e(TAG, "bluetooth connect permission not granted!");
+ has_bluetooth_permission = false;
+ }
+ }
+
+ if( has_bluetooth_permission ) {
+ bluetoothGatt.discoverServices();
+ }
+
currentDepth = -1;
currentTemp = -1;
@@ -120,6 +165,13 @@ public class BluetoothLeService extends Service {
}
void attemptReconnect() {
+ if( !is_bound ) {
+ // We check is_bound in connect() itself, but seems pointless to even try if we
+ // know the service is unbound (and if it's later bound again, we'll try connecting
+ // again anyway without needing this).
+ Log.e(TAG, "don't attempt to reconnect when service not bound");
+ return;
+ }
+
Timer timer = new Timer();
timer.schedule(new TimerTask() {
public void run() {
@@ -299,18 +351,30 @@ public class BluetoothLeService extends Service {
@Override
public IBinder onBind(Intent intent) {
if( MyDebug.LOG )
- Log.d(TAG, "Starting OpenCamera Bluetooth Service");
+ Log.d(TAG, "onBind");
return mBinder;
}
@Override
public boolean onUnbind(Intent intent) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onUnbind");
+ this.is_bound = false;
close();
return super.onUnbind(intent);
}
-
+ /** Only call this after service is bound (from ServiceConnection.onServiceConnected())!
+ */
public boolean initialize() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "initialize");
+
+ // in theory we'd put this in onBind(), to be more symmetric with onUnbind() where we
+ // set to false - but unclear whether onBind() is always called before
+ // ServiceConnection.onServiceConnected().
+ this.is_bound = true;
+
if( bluetoothManager == null ) {
bluetoothManager = (BluetoothManager) getSystemService(Context.BLUETOOTH_SERVICE);
if( bluetoothManager == null ) {
@@ -341,6 +405,43 @@ public class BluetoothLeService extends Service {
Log.d(TAG, "address is null");
return false;
}
+ else if( !is_bound ) {
+ // Don't allow calls to startLeScan() via triggerScan() (which requires location
+ // permission) when service not bound, as application may be in background!
+ // And it doesn't seem sensible to even allow connecting if service not bound.
+ // Under normal operation this isn't needed, but there are calls to connect() that can
+ // happen from postDelayed() or TimerTask in this class, so a risk that they call
+ // connect() after the service is unbound!
+ Log.e(TAG, "connect shouldn't be called when service not bound");
+ return false;
+ }
+
+ // Check for Android 12 Bluetooth permission just in case (and for Android lint error)
+ if( DeviceScanner.useAndroid12BluetoothPermissions() ) {
+ if( ContextCompat.checkSelfPermission(this, Manifest.permission.BLUETOOTH_CONNECT) != PackageManager.PERMISSION_GRANTED ) {
+ Log.e(TAG, "bluetooth connect permission not granted!");
+ return false;
+ }
+ }
+
+ // test code for infinite looping, seeing if this runs in background:
+ /*if( address.equals("undefined") ) {
+ Handler handler = new Handler();
+ handler.postDelayed(new Runnable() {
+ public void run() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "trying connect again from postdelayed");
+ connect(address);
+ }
+ }, 1000);
+ }
+
+ if( address.equals("undefined") ) {
+ // test - only needed if we've hacked BluetoothRemoteControl.remoteEnabled() to not check for being undefined
+ if( MyDebug.LOG )
+ Log.d(TAG, "address is undefined");
+ return false;
+ }*/
if( address.equals(device_address) && bluetoothGatt != null ) {
bluetoothGatt.disconnect();
@@ -378,6 +479,15 @@ public class BluetoothLeService extends Service {
if( bluetoothGatt == null ) {
return;
}
+
+ // Check for Android 12 Bluetooth permission just in case (and for Android lint error)
+ if( DeviceScanner.useAndroid12BluetoothPermissions() ) {
+ if( ContextCompat.checkSelfPermission(this, Manifest.permission.BLUETOOTH_CONNECT) != PackageManager.PERMISSION_GRANTED ) {
+ Log.e(TAG, "bluetooth connect permission not granted!");
+ return;
+ }
+ }
+
bluetoothGatt.close();
bluetoothGatt = null;
}
@@ -394,6 +504,14 @@ public class BluetoothLeService extends Service {
return;
}
+ // Check for Android 12 Bluetooth permission just in case (and for Android lint error)
+ if( DeviceScanner.useAndroid12BluetoothPermissions() ) {
+ if( ContextCompat.checkSelfPermission(this, Manifest.permission.BLUETOOTH_CONNECT) != PackageManager.PERMISSION_GRANTED ) {
+ Log.e(TAG, "bluetooth connect permission not granted!");
+ return;
+ }
+ }
+
String uuid = characteristic.getUuid().toString();
bluetoothGatt.setCharacteristicNotification(characteristic, enabled);
if( enabled ) {
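Editorial note on the repeated permission guards added throughout BluetoothLeService above: the pattern is "on Android 12+ (API 31), BLUETOOTH_CONNECT is a runtime permission that must be re-checked before every GATT call; on older versions the install-time permission suffices". A minimal sketch of that decision, with `sdk_int` and `has_connect_permission` standing in for `Build.VERSION.SDK_INT` and a `ContextCompat.checkSelfPermission()` result (names are illustrative, not from the patch):

```java
public class BlePermissionGuard {
    static final int ANDROID_12 = 31; // Build.VERSION_CODES.S

    // Whether a BluetoothGatt call (connect, discoverServices, close, ...) may proceed.
    static boolean canCallGatt(int sdk_int, boolean has_connect_permission) {
        if( sdk_int >= ANDROID_12 )
            return has_connect_permission; // runtime BLUETOOTH_CONNECT required
        return true; // pre-12: covered by the install-time BLUETOOTH permission
    }

    public static void main(String[] args) {
        System.out.println(canCallGatt(30, false)); // true: pre-Android-12
        System.out.println(canCallGatt(31, false)); // false: permission missing
    }
}
```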
diff --git a/app/src/main/java/net/sourceforge/opencamera/remotecontrol/BluetoothRemoteControl.java b/app/src/main/java/net/sourceforge/opencamera/remotecontrol/BluetoothRemoteControl.java
index eb25c2fabcaf11c8059cef75027978c8dd2daf35..e6d4a0261a01ec2f7d195767e918fc250cea351d 100644
--- a/app/src/main/java/net/sourceforge/opencamera/remotecontrol/BluetoothRemoteControl.java
+++ b/app/src/main/java/net/sourceforge/opencamera/remotecontrol/BluetoothRemoteControl.java
@@ -41,10 +41,23 @@ public class BluetoothRemoteControl {
@Override
public void onServiceConnected(ComponentName componentName, IBinder service) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onServiceConnected");
if( Build.VERSION.SDK_INT < Build.VERSION_CODES.JELLY_BEAN_MR2 ) {
// BluetoothLeService requires Android 4.3+
return;
}
+ if( main_activity.isAppPaused() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "but app is now paused");
+ // Unclear if this could happen - possibly if app pauses immediately after starting
+ // the service, but before we connect? In theory we should then unbind the service,
+ // but seems safer not to try to call initialize or connect.
+ // This will mean the BluetoothLeService still thinks it's unbound (is_bound will
+ // be left false), but that's fine - it just means we'll enforce not trying to
+ // connect at a later stage.
+ return;
+ }
bluetoothLeService = ((BluetoothLeService.LocalBinder) service).getService();
if( !bluetoothLeService.initialize() ) {
Log.e(TAG, "Unable to initialize Bluetooth");
@@ -54,8 +67,17 @@ public class BluetoothRemoteControl {
bluetoothLeService.connect(remoteDeviceAddress);
}
+ /** Called when a connection to the Service has been lost. This typically happens when the
+ * process hosting the service has crashed or been killed.
+ * So in particular, note this isn't the inverse of onServiceConnected() - whilst
+ * onServiceConnected() is always called (after the service receives onBind()), upon normal
+ * disconnection (after we call unbindService()) the service receives onUnbind(), but
+ * onServiceDisconnected() is not called.
+ */
@Override
public void onServiceDisconnected(ComponentName componentName) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onServiceDisconnected");
Handler handler = new Handler();
handler.postDelayed(new Runnable() {
public void run() {
@@ -232,7 +254,10 @@ public class BluetoothRemoteControl {
return;
}
Intent gattServiceIntent = new Intent(main_activity, BluetoothLeService.class);
- if( remoteEnabled()) {
+ // Check isAppPaused() just to be safe - in theory shouldn't be needed, but don't want to
+ // start up the service if we're in background! (And we might as well then try to stop the
+ // service instead.)
+ if( !main_activity.isAppPaused() && remoteEnabled() ) {
if( MyDebug.LOG )
Log.d(TAG, "Remote enabled, starting service");
main_activity.bindService(gattServiceIntent, mServiceConnection, Context.BIND_AUTO_CREATE);
@@ -287,6 +312,7 @@ public class BluetoothRemoteControl {
boolean remote_enabled = sharedPreferences.getBoolean(PreferenceKeys.EnableRemote, false);
remoteDeviceType = sharedPreferences.getString(PreferenceKeys.RemoteType, "undefined");
remoteDeviceAddress = sharedPreferences.getString(PreferenceKeys.RemoteName, "undefined");
+ //return remote_enabled; // test - if using this, also need to enable test code in BluetoothLeService.connect()
return remote_enabled && !remoteDeviceAddress.equals("undefined");
}
}
diff --git a/app/src/main/java/net/sourceforge/opencamera/remotecontrol/DeviceScanner.java b/app/src/main/java/net/sourceforge/opencamera/remotecontrol/DeviceScanner.java
index ecb2a268458da94bb4bcd781ae2a68bb60f100e5..54aba108549c8da0b8717ee01f24991fb5316661 100644
--- a/app/src/main/java/net/sourceforge/opencamera/remotecontrol/DeviceScanner.java
+++ b/app/src/main/java/net/sourceforge/opencamera/remotecontrol/DeviceScanner.java
@@ -3,7 +3,7 @@ package net.sourceforge.opencamera.remotecontrol;
import android.Manifest;
import android.app.Activity;
import android.app.AlertDialog;
-import android.app.ListActivity;
+//import android.app.ListActivity;
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothManager;
@@ -18,12 +18,14 @@ import android.os.Handler;
import android.preference.PreferenceManager;
import androidx.annotation.NonNull;
import androidx.annotation.RequiresApi;
+import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import android.util.Log;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
+import android.widget.AdapterView;
import android.widget.BaseAdapter;
import android.widget.Button;
import android.widget.ListView;
@@ -37,7 +39,9 @@ import foundation.e.camera.R;
import java.util.ArrayList;
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2)
-public class DeviceScanner extends ListActivity {
+//public class DeviceScanner extends ListActivity {
+//public class DeviceScanner extends Activity {
+public class DeviceScanner extends AppCompatActivity {
private static final String TAG = "OC-BLEScanner";
private LeDeviceListAdapter leDeviceListAdapter;
private BluetoothAdapter bluetoothAdapter;
@@ -47,6 +51,7 @@ public class DeviceScanner extends ListActivity {
private static final int REQUEST_ENABLE_BT = 1;
private static final int REQUEST_LOCATION_PERMISSIONS = 2;
+ private static final int REQUEST_BLUETOOTHSCANCONNECT_PERMISSIONS = 3;
@Override
public void onCreate(Bundle savedInstanceState) {
@@ -83,42 +88,109 @@ public class DeviceScanner extends ListActivity {
TextView currentRemote = findViewById(R.id.currentRemote);
currentRemote.setText(getResources().getString(R.string.bluetooth_current_remote) + " " + remote_name);
+ }
+
+ @Override
+ public void onContentChanged() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onContentChanged");
+ super.onContentChanged();
+
+ ListView list = findViewById(R.id.list);
+ list.setOnItemClickListener(new AdapterView.OnItemClickListener() {
+ public void onItemClick(AdapterView<?> parent, View v, int position, long id) {
+ onListItemClick((ListView)parent, v, position, id);
+ }
+ });
}
- private void startScanning() {
+ /** Returns whether we can use the new Android 12 permissions for bluetooth (BLUETOOTH_SCAN,
+ * BLUETOOTH_CONNECT) - if so, we should use these and NOT location permissions.
+ * See https://developer.android.com/guide/topics/connectivity/bluetooth/permissions .
+ */
+ static boolean useAndroid12BluetoothPermissions() {
+ return Build.VERSION.SDK_INT >= Build.VERSION_CODES.S;
+ }
+ private void checkBluetoothEnabled() {
if( MyDebug.LOG )
- Log.d(TAG, "Start scanning");
-
+ Log.d(TAG, "checkBluetoothEnabled");
+ // BLUETOOTH_CONNECT permission is needed for BluetoothAdapter.ACTION_REQUEST_ENABLE.
+ // Callers should have already checked for bluetooth permission, but we have this check
+ // just in case - and also to avoid the Android lint error that we'd get.
+ if( useAndroid12BluetoothPermissions() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "check for bluetooth connect permission");
+ if( ContextCompat.checkSelfPermission(this, Manifest.permission.BLUETOOTH_CONNECT) != PackageManager.PERMISSION_GRANTED ) {
+ Log.e(TAG, "bluetooth connect permission not granted!");
+ return;
+ }
+ }
if( !bluetoothAdapter.isEnabled() ) {
// fire an intent to display a dialog asking the user to grant permission to enable Bluetooth
+ // n.b., on Android 12 need BLUETOOTH_CONNECT permission for this
+ if( MyDebug.LOG )
+ Log.d(TAG, "request to enable bluetooth");
Intent enableBtIntent = new Intent(BluetoothAdapter.ACTION_REQUEST_ENABLE);
startActivityForResult(enableBtIntent, REQUEST_ENABLE_BT);
}
+ }
- leDeviceListAdapter = new LeDeviceListAdapter();
- setListAdapter(leDeviceListAdapter);
+ private void startScanning() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "Start scanning");
// In real life most of bluetooth LE devices associated with location, so without this
// permission the sample shows nothing in most cases
// Also see https://stackoverflow.com/questions/33045581/location-needs-to-be-enabled-for-bluetooth-low-energy-scanning-on-android-6-0
- int permissionCoarse = Build.VERSION.SDK_INT >= 23 ?
- ContextCompat
- .checkSelfPermission(this, Manifest.permission.ACCESS_COARSE_LOCATION) :
- PackageManager.PERMISSION_GRANTED;
+ // Update: on Android 10+, ACCESS_FINE_LOCATION is needed: https://developer.android.com/about/versions/10/privacy/changes#location-telephony-bluetooth-wifi
+ // Update: on Android 12+, we use the new bluetooth permissions instead of location permissions.
+ boolean has_permission = false;
+ if( useAndroid12BluetoothPermissions() ) {
+ if( ContextCompat.checkSelfPermission(this, Manifest.permission.BLUETOOTH_SCAN) == PackageManager.PERMISSION_GRANTED
+ &&
+ ContextCompat.checkSelfPermission(this, Manifest.permission.BLUETOOTH_CONNECT) == PackageManager.PERMISSION_GRANTED
+ ) {
+ has_permission = true;
+ }
+ }
+ else {
+ String permission_needed = Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q ? Manifest.permission.ACCESS_FINE_LOCATION : Manifest.permission.ACCESS_COARSE_LOCATION;
- if( permissionCoarse == PackageManager.PERMISSION_GRANTED ) {
+ int permissionCoarse = Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ?
+ ContextCompat
+ .checkSelfPermission(this, permission_needed) :
+ PackageManager.PERMISSION_GRANTED;
+
+ if( permissionCoarse == PackageManager.PERMISSION_GRANTED ) {
+ has_permission = true;
+ }
+ }
+
+ if( has_permission ) {
+ checkBluetoothEnabled();
+ }
+
+ leDeviceListAdapter = new LeDeviceListAdapter();
+ //setListAdapter(leDeviceListAdapter);
+ ListView list = findViewById(R.id.list);
+ list.setAdapter(leDeviceListAdapter);
+
+ if( has_permission ) {
scanLeDevice(true);
}
else {
- askForLocationPermission();
+ askForDeviceScannerPermission();
}
}
- private void askForLocationPermission() {
+ /** Request permissions needed for bluetooth (BLUETOOTH_SCAN and BLUETOOTH_CONNECT on Android
+ * 12+, else location permission).
+ */
+ private void askForDeviceScannerPermission() {
if( MyDebug.LOG )
- Log.d(TAG, "askForLocationPermission");
+ Log.d(TAG, "askForDeviceScannerPermission");
// n.b., we only need ACCESS_COARSE_LOCATION, but it's simpler to request both to be consistent with Open Camera's
// location permission requests in PermissionHandler. If we only request ACCESS_COARSE_LOCATION here, and later the
// user enables something that needs ACCESS_FINE_LOCATION, Android ends up showing the "rationale" dialog - and once
@@ -127,23 +199,71 @@ public class DeviceScanner extends ListActivity {
// Also note that if we did want to only request ACCESS_COARSE_LOCATION here, we'd need to declare that permission
// explicitly in the AndroidManifest.xml, otherwise the dialog to request permission is never shown (and the permission
// is denied automatically).
- if( ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.ACCESS_FINE_LOCATION) ||
- ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.ACCESS_COARSE_LOCATION) ) {
- // Show an explanation to the user *asynchronously* -- don't block
- // this thread waiting for the user's response! After the user
- // sees the explanation, try again to request the permission.
- showRequestLocationPermissionRationale();
+ // Update: on Android 10+, ACCESS_FINE_LOCATION is needed anyway: https://developer.android.com/about/versions/10/privacy/changes#location-telephony-bluetooth-wifi
+ // Update: on Android 12+, we use the new bluetooth permissions instead of location permissions.
+ if( useAndroid12BluetoothPermissions() ) {
+ if( ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.BLUETOOTH_SCAN) ||
+ ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.BLUETOOTH_CONNECT) ) {
+ // Show an explanation to the user *asynchronously* -- don't block
+ // this thread waiting for the user's response! After the user
+ // sees the explanation, try again to request the permission.
+ showRequestBluetoothScanConnectPermissionRationale();
+ }
+ else {
+ // Can go ahead and request the permission
+ if( MyDebug.LOG )
+ Log.d(TAG, "requesting bluetooth scan/connect permissions...");
+ ActivityCompat.requestPermissions(this,
+ new String[]{Manifest.permission.BLUETOOTH_SCAN, Manifest.permission.BLUETOOTH_CONNECT},
+ REQUEST_BLUETOOTHSCANCONNECT_PERMISSIONS);
+ }
}
else {
- // Can go ahead and request the permission
- if( MyDebug.LOG )
- Log.d(TAG, "requesting location permissions...");
- ActivityCompat.requestPermissions(this,
- new String[]{Manifest.permission.ACCESS_FINE_LOCATION, Manifest.permission.ACCESS_COARSE_LOCATION},
- REQUEST_LOCATION_PERMISSIONS);
+ if( ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.ACCESS_FINE_LOCATION) ||
+ ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.ACCESS_COARSE_LOCATION) ) {
+ // Show an explanation to the user *asynchronously* -- don't block
+ // this thread waiting for the user's response! After the user
+ // sees the explanation, try again to request the permission.
+ showRequestLocationPermissionRationale();
+ }
+ else {
+ // Can go ahead and request the permission
+ if( MyDebug.LOG )
+ Log.d(TAG, "requesting location permissions...");
+ ActivityCompat.requestPermissions(this,
+ new String[]{Manifest.permission.ACCESS_FINE_LOCATION, Manifest.permission.ACCESS_COARSE_LOCATION},
+ REQUEST_LOCATION_PERMISSIONS);
+ }
}
}
+ private void showRequestBluetoothScanConnectPermissionRationale() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "showRequestBluetoothScanConnectPermissionRationale");
+ if( !useAndroid12BluetoothPermissions() ) {
+ // just in case!
+ Log.e(TAG, "shouldn't be requesting bluetooth scan/connect permissions!");
+ return;
+ }
+
+ String [] permissions = new String[]{Manifest.permission.BLUETOOTH_SCAN, Manifest.permission.BLUETOOTH_CONNECT};
+ int message_id = R.string.permission_rationale_bluetooth_scan_connect;
+
+ final String [] permissions_f = permissions;
+ new AlertDialog.Builder(this)
+ .setTitle(R.string.permission_rationale_title)
+ .setMessage(message_id)
+ .setIcon(android.R.drawable.ic_dialog_alert)
+ .setPositiveButton(android.R.string.ok, null)
+ .setOnDismissListener(new DialogInterface.OnDismissListener() {
+ public void onDismiss(DialogInterface dialog) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "requesting permission...");
+ ActivityCompat.requestPermissions(DeviceScanner.this, permissions_f, REQUEST_BLUETOOTHSCANCONNECT_PERMISSIONS);
+ }
+ }).show();
+ }
+
private void showRequestLocationPermissionRationale() {
if( MyDebug.LOG )
Log.d(TAG, "showRequestLocationPermissionRationale");
@@ -176,18 +296,37 @@ public class DeviceScanner extends ListActivity {
@NonNull int[] grantResults) {
if( MyDebug.LOG )
Log.d(TAG, "onRequestPermissionsResult: requestCode " + requestCode);
- //noinspection SwitchStatementWithTooFewBranches
+
+ super.onRequestPermissionsResult(requestCode, permissions, grantResults);
+
switch (requestCode) {
case REQUEST_LOCATION_PERMISSIONS: {
if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
if( MyDebug.LOG )
Log.d(TAG, "location permission granted");
+ checkBluetoothEnabled();
scanLeDevice(true);
}
else {
if( MyDebug.LOG )
Log.d(TAG, "location permission denied");
}
+
+ break;
+ }
+ case REQUEST_BLUETOOTHSCANCONNECT_PERMISSIONS: {
+ if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "bluetooth scan/connect permission granted");
+ checkBluetoothEnabled();
+ scanLeDevice(true);
+ }
+ else {
+ if( MyDebug.LOG )
+ Log.d(TAG, "bluetooth scan/connect permission denied");
+ }
+
+ break;
}
}
}
@@ -207,7 +346,7 @@ public class DeviceScanner extends ListActivity {
@Override
protected void onPause() {
if( MyDebug.LOG )
- Log.d(TAG, "pause...");
+ Log.d(TAG, "onPause");
super.onPause();
if( is_scanning ) {
scanLeDevice(false);
@@ -216,6 +355,33 @@ public class DeviceScanner extends ListActivity {
}
@Override
+ protected void onStop() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onStop");
+ super.onStop();
+
+ // we do this in onPause, but do it here again just to be certain!
+ if( is_scanning ) {
+ scanLeDevice(false);
+ leDeviceListAdapter.clear();
+ }
+ }
+
+ @Override
+ protected void onDestroy() {
+ if( MyDebug.LOG )
+ Log.d(TAG, "onDestroy");
+
+ // we do this in onPause, but do it here again just to be certain!
+ if( is_scanning ) {
+ scanLeDevice(false);
+ leDeviceListAdapter.clear();
+ }
+
+ super.onDestroy();
+ }
+
+ //@Override
protected void onListItemClick(ListView l, View v, int position, long id) {
final BluetoothDevice device = leDeviceListAdapter.getDevice(position);
if( device == null )
@@ -235,14 +401,31 @@ public class DeviceScanner extends ListActivity {
private void scanLeDevice(final boolean enable) {
if( MyDebug.LOG )
Log.d(TAG, "scanLeDevice: " + enable);
+
+ // BLUETOOTH_SCAN permission is needed for bluetoothAdapter.startLeScan and
+ // bluetoothAdapter.stopLeScan. Callers should have already checked for bluetooth
+ // permission, but we have this check just in case - and also to avoid the Android lint
+ // error that we'd get.
+ if( useAndroid12BluetoothPermissions() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "check for bluetooth scan permission");
+ if( ContextCompat.checkSelfPermission(this, Manifest.permission.BLUETOOTH_SCAN) != PackageManager.PERMISSION_GRANTED ) {
+ Log.e(TAG, "bluetooth scan permission not granted!");
+ return;
+ }
+ }
+
if( enable ) {
// stop scanning after certain time
bluetoothHandler.postDelayed(new Runnable() {
@Override
public void run() {
- is_scanning = false;
+ if( MyDebug.LOG )
+ Log.d(TAG, "stop scanning after delay");
+ /*is_scanning = false;
bluetoothAdapter.stopLeScan(mLeScanCallback);
- invalidateOptionsMenu();
+ invalidateOptionsMenu();*/
+ scanLeDevice(false);
}
}, 10000);
@@ -309,12 +492,32 @@ public class DeviceScanner extends ListActivity {
viewHolder = (ViewHolder) view.getTag();
}
+ // BLUETOOTH_CONNECT permission is needed for device.getName. In theory we shouldn't
+ // have added to this list if bluetooth permission not available, but we have this
+ // check just in case - and also to avoid the Android lint error that we'd get.
+ boolean has_bluetooth_connect_permission = true;
+ if( useAndroid12BluetoothPermissions() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "check for bluetooth connect permission");
+ if( ContextCompat.checkSelfPermission(DeviceScanner.this, Manifest.permission.BLUETOOTH_CONNECT) != PackageManager.PERMISSION_GRANTED ) {
+ has_bluetooth_connect_permission = false;
+ }
+ }
+
 BluetoothDevice device = mLeDevices.get(i);
-            final String deviceName = device.getName();
-            if( deviceName != null && deviceName.length() > 0 )
-                viewHolder.deviceName.setText(deviceName);
-            else
-                viewHolder.deviceName.setText(R.string.unknown_device);
+
+ if( !has_bluetooth_connect_permission ) {
+ Log.e(TAG, "bluetooth connect permission not granted!");
+ viewHolder.deviceName.setText(R.string.unknown_device_no_permission);
+ }
+ else {
+ final String deviceName = device.getName();
+ if( deviceName != null && deviceName.length() > 0 )
+ viewHolder.deviceName.setText(deviceName);
+ else
+ viewHolder.deviceName.setText(R.string.unknown_device);
+ }
+
viewHolder.deviceAddress.setText(device.getAddress());
return view;
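Editorial note on the DeviceScanner changes above: the permission that gates BLE scanning now depends on the Android version - BLUETOOTH_SCAN on Android 12+, ACCESS_FINE_LOCATION on Android 10-11, ACCESS_COARSE_LOCATION before that. A sketch of that selection, mirroring the logic in `startScanning()` (the helper name is illustrative, not from the patch):

```java
public class ScanPermissionChooser {
    static final int ANDROID_10 = 29; // Build.VERSION_CODES.Q
    static final int ANDROID_12 = 31; // Build.VERSION_CODES.S

    // Which permission must be granted before calling startLeScan() on a given version.
    static String permissionNeededForScan(int sdk_int) {
        if( sdk_int >= ANDROID_12 )
            return "android.permission.BLUETOOTH_SCAN"; // new bluetooth permissions replace location
        if( sdk_int >= ANDROID_10 )
            return "android.permission.ACCESS_FINE_LOCATION"; // fine location required on Android 10+
        return "android.permission.ACCESS_COARSE_LOCATION";
    }

    public static void main(String[] args) {
        System.out.println(permissionNeededForScan(28)); // ACCESS_COARSE_LOCATION
        System.out.println(permissionNeededForScan(31)); // BLUETOOTH_SCAN
    }
}
```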
diff --git a/app/src/main/java/net/sourceforge/opencamera/ui/DrawPreview.java b/app/src/main/java/net/sourceforge/opencamera/ui/DrawPreview.java
index 717d5111af09abf1226c347dae4859bde0af9178..41df473781080b5eaa3a3f1e9604c0cc5c0f4c51 100644
--- a/app/src/main/java/net/sourceforge/opencamera/ui/DrawPreview.java
+++ b/app/src/main/java/net/sourceforge/opencamera/ui/DrawPreview.java
@@ -1,6 +1,5 @@
package net.sourceforge.opencamera.ui;
-import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.text.DateFormat;
@@ -42,10 +41,8 @@ import android.graphics.RectF;
import android.graphics.Typeface;
import android.graphics.drawable.Drawable;
import android.location.Location;
-import android.media.ExifInterface;
import android.net.Uri;
import android.os.BatteryManager;
-import android.os.Build;
import android.preference.PreferenceManager;
import android.util.Log;
import android.util.Pair;
@@ -62,6 +59,8 @@ public class DrawPreview {
private final MainActivity main_activity;
private final MyApplicationInterface applicationInterface;
+ private boolean cover_preview; // whether to cover the preview for Camera2 API
+
// store to avoid calling PreferenceManager.getDefaultSharedPreferences() repeatedly
private final SharedPreferences sharedPreferences;
@@ -169,9 +168,13 @@ public class DrawPreview {
private Bitmap hdr_bitmap;
private Bitmap panorama_bitmap;
private Bitmap expo_bitmap;
- private Bitmap focus_bracket_bitmap;
+ //private Bitmap focus_bracket_bitmap;
+ // no longer bother with a focus bracketing icon - hard to come up with a clear icon, and should be obvious from the two on-screen seekbars
private Bitmap burst_bitmap;
private Bitmap nr_bitmap;
+ private Bitmap x_night_bitmap;
+ private Bitmap x_bokeh_bitmap;
+ private Bitmap x_beauty_bitmap;
private Bitmap photostamp_bitmap;
private Drawable flash_drawable;
private Drawable face_detection_drawable;
@@ -189,6 +192,7 @@ public class DrawPreview {
private Bitmap last_thumbnail; // thumbnail of last picture taken
private volatile boolean thumbnail_anim; // whether we are displaying the thumbnail animation; must be volatile for test project reading the state
private long thumbnail_anim_start_ms = -1; // time that the thumbnail animation started
+ public volatile int test_thumbnail_anim_count;
private final RectF thumbnail_anim_src_rect = new RectF();
private final RectF thumbnail_anim_dst_rect = new RectF();
private final Matrix thumbnail_anim_matrix = new Matrix();
@@ -221,7 +225,7 @@ public class DrawPreview {
private float view_angle_y_preview;
private long last_view_angles_time;
- private int take_photo_top; // coordinate (in canvas x coordinates) of top of the take photo icon
+ private int take_photo_top; // coordinate (in canvas x coordinates, or y coords if system_orientation_portrait==true) of top of the take photo icon
private long last_take_photo_top_time;
private int top_icon_shift; // shift that may be needed for on-screen text to avoid clashing with icons (when arranged "along top")
@@ -249,6 +253,7 @@ public class DrawPreview {
p.setTypeface(Typeface.create(Typeface.DEFAULT, Typeface.BOLD));
p.setStrokeCap(Paint.Cap.ROUND);
scale = getContext().getResources().getDisplayMetrics().density;
+ //noinspection PointlessArithmeticExpression
this.stroke_width = (1.0f * scale + 0.5f); // convert dps to pixels
p.setStrokeWidth(this.stroke_width);
@@ -261,9 +266,12 @@ public class DrawPreview {
hdr_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.ic_hdr_on_white_48dp);
panorama_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.baseline_panorama_horizontal_white_48);
expo_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.expo_icon);
- focus_bracket_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.focus_bracket_icon);
+ //focus_bracket_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.focus_bracket_icon);
burst_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.ic_burst_mode_white_48dp);
nr_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.nr_icon);
+ x_night_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.baseline_bedtime_white_48);
+ x_bokeh_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.baseline_portrait_white_48);
+ x_beauty_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.baseline_face_retouching_natural_white_48);
photostamp_bitmap = BitmapFactory.decodeResource(getContext().getResources(), R.drawable.ic_text_format_white_48dp);
flash_drawable = getContext().getResources().getDrawable(R.drawable.ic_camera_flash_on);
face_detection_drawable = getContext().getResources().getDrawable(R.drawable.ic_face);
@@ -316,10 +324,10 @@ public class DrawPreview {
expo_bitmap.recycle();
expo_bitmap = null;
}
- if( focus_bracket_bitmap != null ) {
+ /*if( focus_bracket_bitmap != null ) {
focus_bracket_bitmap.recycle();
focus_bracket_bitmap = null;
- }
+ }*/
if( burst_bitmap != null ) {
burst_bitmap.recycle();
burst_bitmap = null;
@@ -328,6 +336,18 @@ public class DrawPreview {
nr_bitmap.recycle();
nr_bitmap = null;
}
+ if( x_night_bitmap != null ) {
+ x_night_bitmap.recycle();
+ x_night_bitmap = null;
+ }
+ if( x_bokeh_bitmap != null ) {
+ x_bokeh_bitmap.recycle();
+ x_bokeh_bitmap = null;
+ }
+ if( x_beauty_bitmap != null ) {
+ x_beauty_bitmap.recycle();
+ x_beauty_bitmap = null;
+ }
if( photostamp_bitmap != null ) {
photostamp_bitmap.recycle();
photostamp_bitmap = null;
@@ -377,18 +397,34 @@ public class DrawPreview {
* *after* applying the rotation, when we want the top left of the icon as shown on screen.
* This should not be called every frame but instead should be cached, due to cost of calling
* view.getLocationOnScreen().
+ * Update: For supporting landscape and portrait (if MainActivity.lock_to_landscape==false),
+ * instead this returns the top side if in portrait. Note though we still need to take rotation
+ * into account, as we still apply rotation to the icons when changing orientations (e.g., this
+ * is needed when rotating from reverse landscape to portrait, for on-screen text like level
+ * angle to be offset correctly above the shutter button (see take_photo_top) when the preview
+ * has a wide aspect ratio).
*/
private int getViewOnScreenX(View view) {
view.getLocationOnScreen(gui_location);
- int xpos = gui_location[0];
+
+ MainActivity.SystemOrientation system_orientation = main_activity.getSystemOrientation();
+ boolean system_orientation_portrait = system_orientation == MainActivity.SystemOrientation.PORTRAIT;
+ int xpos = gui_location[system_orientation_portrait ? 1 : 0];
int rotation = Math.round(view.getRotation());
// rotation can be outside [0, 359] if the user repeatedly rotates in same direction!
rotation = (rotation % 360 + 360) % 360; // version of (rotation % 360) that works if rotation is -ve
/*if( MyDebug.LOG )
Log.d(TAG, " mod rotation: " + rotation);*/
- if( rotation == 180 || rotation == 90 ) {
- // annoying behaviour that getLocationOnScreen takes the rotation into account
- xpos -= view.getWidth();
+ // undo annoying behaviour that getLocationOnScreen takes the rotation into account
+ if( system_orientation_portrait ) {
+ if( rotation == 180 || rotation == 270 ) {
+ xpos -= view.getHeight();
+ }
+ }
+ else {
+ if( rotation == 90 || rotation == 180 ) {
+ xpos -= view.getWidth();
+ }
}
return xpos;
}
@@ -404,6 +440,9 @@ public class DrawPreview {
Log.d(TAG, "thumbnail_anim started");
thumbnail_anim = true;
thumbnail_anim_start_ms = System.currentTimeMillis();
+ test_thumbnail_anim_count++;
+ if( MyDebug.LOG )
+ Log.d(TAG, "test_thumbnail_anim_count is now: " + test_thumbnail_anim_count);
}
Bitmap old_thumbnail = this.last_thumbnail;
this.last_thumbnail = thumbnail;
@@ -586,8 +625,7 @@ public class DrawPreview {
}
Uri uri = Uri.parse(ghost_selected_image_pref);
try {
- File file = main_activity.getStorageUtils().getFileFromDocumentUriSAF(uri, false);
- ghost_selected_image_bitmap = loadBitmap(uri, file);
+ ghost_selected_image_bitmap = loadBitmap(uri);
}
catch(IOException e) {
Log.e(TAG, "failed to load ghost_selected_image uri: " + uri);
@@ -656,7 +694,7 @@ public class DrawPreview {
last_take_photo_top_time = 0; // force take_photo_top to be recomputed
last_top_icon_shift_time = 0; // force top_icon_shift to be recomputed
- focus_seekbars_margin_left = -1; // just in case??
+ focus_seekbars_margin_left = -1; // needed as the focus seekbars can only be updated when visible
has_settings = true;
}
@@ -675,11 +713,10 @@ public class DrawPreview {
}
}
- /** Loads the bitmap from the uri. File is optional, and is used on pre-Android 7 devices to
- * read the exif orientation.
+ /** Loads the bitmap from the uri.
* The image will be downscaled if required to be comparable to the preview width.
*/
- private Bitmap loadBitmap(Uri uri, File file) throws IOException {
+ private Bitmap loadBitmap(Uri uri) throws IOException {
if( MyDebug.LOG )
Log.d(TAG, "loadBitmap: " + uri);
Bitmap bitmap;
@@ -748,59 +785,7 @@ public class DrawPreview {
// now need to take exif orientation into account, as some devices or camera apps store the orientation in the exif tag,
// which getBitmap() doesn't account for
- ExifInterface exif = null;
- if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N ) {
- // better to use the Uri from Android 7, so this works when images are shared to Vibrance
- try( InputStream inputStream = main_activity.getContentResolver().openInputStream(uri) ) {
- exif = new ExifInterface(inputStream);
- }
- }
- else {
- if( file != null ) {
- exif = new ExifInterface(file.getAbsolutePath());
- }
- }
- if( exif != null ) {
- int exif_orientation_s = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_UNDEFINED);
- boolean needs_tf = false;
- int exif_orientation = 0;
- // see http://jpegclub.org/exif_orientation.html
- // and http://stackoverflow.com/questions/20478765/how-to-get-the-correct-orientation-of-the-image-selected-from-the-default-image
- if( exif_orientation_s == ExifInterface.ORIENTATION_UNDEFINED || exif_orientation_s == ExifInterface.ORIENTATION_NORMAL ) {
- // leave unchanged
- }
- else if( exif_orientation_s == ExifInterface.ORIENTATION_ROTATE_180 ) {
- needs_tf = true;
- exif_orientation = 180;
- }
- else if( exif_orientation_s == ExifInterface.ORIENTATION_ROTATE_90 ) {
- needs_tf = true;
- exif_orientation = 90;
- }
- else if( exif_orientation_s == ExifInterface.ORIENTATION_ROTATE_270 ) {
- needs_tf = true;
- exif_orientation = 270;
- }
- else {
- // just leave unchanged for now
- if( MyDebug.LOG )
- Log.e(TAG, " unsupported exif orientation: " + exif_orientation_s);
- }
- if( MyDebug.LOG )
- Log.d(TAG, " exif orientation: " + exif_orientation);
-
- if( needs_tf ) {
- if( MyDebug.LOG )
- Log.d(TAG, " need to rotate bitmap due to exif orientation tag");
- Matrix m = new Matrix();
- m.setRotate(exif_orientation, bitmap.getWidth() * 0.5f, bitmap.getHeight() * 0.5f);
- Bitmap rotated_bitmap = Bitmap.createBitmap(bitmap, 0, 0,bitmap.getWidth(), bitmap.getHeight(), m, true);
- if( rotated_bitmap != bitmap ) {
- bitmap.recycle();
- bitmap = rotated_bitmap;
- }
- }
- }
+ bitmap = main_activity.rotateForExif(bitmap, uri);
return bitmap;
}
@@ -984,10 +969,9 @@ public class DrawPreview {
canvas.drawLine(0.0f, 0.0f, canvas.getHeight() - 1.0f, canvas.getHeight() - 1.0f, p);
canvas.drawLine(canvas.getHeight() - 1.0f, 0.0f, 0.0f, canvas.getHeight() - 1.0f, p);
int diff = canvas.getWidth() - canvas.getHeight();
- if (diff > 0) {
- canvas.drawLine(diff, 0.0f, diff + canvas.getHeight() - 1.0f, canvas.getHeight() - 1.0f, p);
- canvas.drawLine(diff + canvas.getHeight() - 1.0f, 0.0f, diff, canvas.getHeight() - 1.0f, p);
- }
+ // n.b., diff is -ve in portrait orientation
+ canvas.drawLine(diff, 0.0f, diff + canvas.getHeight() - 1.0f, canvas.getHeight() - 1.0f, p);
+ canvas.drawLine(diff + canvas.getHeight() - 1.0f, 0.0f, diff, canvas.getHeight() - 1.0f, p);
break;
}
}
@@ -1037,37 +1021,47 @@ public class DrawPreview {
crop_ratio = 2.4;
break;
}
- // we should compare to getCurrentPreviewAspectRatio() not getCurrentPreviewAspectRatio(), as the actual preview
- // aspect ratio may differ to the requested photo/video resolution's aspect ratio, in which case it's still useful
- // to display the crop guide
- if( crop_ratio > 0.0 && Math.abs(preview.getCurrentPreviewAspectRatio() - crop_ratio) > 1.0e-5 ) {
- /*if( MyDebug.LOG ) {
- Log.d(TAG, "crop_ratio: " + crop_ratio);
- Log.d(TAG, "preview_targetRatio: " + preview_targetRatio);
- Log.d(TAG, "canvas width: " + canvas.getWidth());
- Log.d(TAG, "canvas height: " + canvas.getHeight());
- }*/
- int left = 1, top = 1, right = canvas.getWidth()-1, bottom = canvas.getHeight()-1;
- if( crop_ratio > preview.getTargetRatio() ) {
- // crop ratio is wider, so we have to crop top/bottom
- double new_hheight = ((double)canvas.getWidth()) / (2.0f*crop_ratio);
- top = (canvas.getHeight()/2 - (int)new_hheight);
- bottom = (canvas.getHeight()/2 + (int)new_hheight);
+ if( crop_ratio > 0.0 ) {
+ // we should compare to getCurrentPreviewAspectRatio() not getTargetRatio(), as the actual preview
+ // aspect ratio may differ from the requested photo/video resolution's aspect ratio, in which case it's still useful
+ // to display the crop guide
+ double preview_aspect_ratio = preview.getCurrentPreviewAspectRatio();
+ MainActivity.SystemOrientation system_orientation = main_activity.getSystemOrientation();
+ boolean system_orientation_portrait = system_orientation == MainActivity.SystemOrientation.PORTRAIT;
+ if( system_orientation_portrait ) {
+ // crop ratios are always drawn as if in landscape
+ crop_ratio = 1.0/crop_ratio;
+ preview_aspect_ratio = 1.0/preview_aspect_ratio;
}
- else {
- // crop ratio is taller, so we have to crop left/right
- double new_hwidth = (((double)canvas.getHeight()) * crop_ratio) / 2.0f;
- left = (canvas.getWidth()/2 - (int)new_hwidth);
- right = (canvas.getWidth()/2 + (int)new_hwidth);
+ if( Math.abs(preview_aspect_ratio - crop_ratio) > 1.0e-5 ) {
+ /*if( MyDebug.LOG ) {
+ Log.d(TAG, "crop_ratio: " + crop_ratio);
+ Log.d(TAG, "preview_aspect_ratio: " + preview_aspect_ratio);
+ Log.d(TAG, "canvas width: " + canvas.getWidth());
+ Log.d(TAG, "canvas height: " + canvas.getHeight());
+ }*/
+ int left = 1, top = 1, right = canvas.getWidth()-1, bottom = canvas.getHeight()-1;
+ if( crop_ratio > preview_aspect_ratio ) {
+ // crop ratio is wider, so we have to crop top/bottom
+ double new_hheight = ((double)canvas.getWidth()) / (2.0f*crop_ratio);
+ top = (canvas.getHeight()/2 - (int)new_hheight);
+ bottom = (canvas.getHeight()/2 + (int)new_hheight);
+ }
+ else {
+ // crop ratio is taller, so we have to crop left/right
+ double new_hwidth = (((double)canvas.getHeight()) * crop_ratio) / 2.0f;
+ left = (canvas.getWidth()/2 - (int)new_hwidth);
+ right = (canvas.getWidth()/2 + (int)new_hwidth);
+ }
+ canvas.drawRect(left, top, right, bottom, p);
}
- canvas.drawRect(left, top, right, bottom, p);
}
p.setStyle(Paint.Style.FILL); // reset
}
}
}
- private void onDrawInfoLines(Canvas canvas, final int top_x, final int top_y, final int bottom_y, long time_ms) {
+ private void onDrawInfoLines(Canvas canvas, final int top_x, final int top_y, final int bottom_y, final int device_ui_rotation, long time_ms) {
Preview preview = main_activity.getPreview();
CameraController camera_controller = preview.getCameraController();
int ui_rotation = preview.getUIRotation();
@@ -1081,15 +1075,16 @@ public class DrawPreview {
final int gap_y = (int) (0 * scale + 0.5f); // convert dps to pixels
final int icon_gap_y = (int) (2 * scale + 0.5f); // convert dps to pixels
if( ui_rotation == 90 || ui_rotation == 270 ) {
+ // n.b., this is only for when lock_to_landscape==true, so we don't look at device_ui_rotation
int diff = canvas.getWidth() - canvas.getHeight();
location_x += diff/2;
location_y -= diff/2;
}
- if( ui_rotation == 90 ) {
+ if( device_ui_rotation == 90 ) {
location_y = canvas.getHeight() - location_y - (int) (20 * scale + 0.5f);
}
boolean align_right = false;
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x = canvas.getWidth() - location_x;
p.setTextAlign(Paint.Align.RIGHT);
align_right = true;
@@ -1155,7 +1150,7 @@ public class DrawPreview {
first_line_height = Math.max(first_line_height, height);
}
// update location_y for first line (time and camera id)
- if( ui_rotation == 90 ) {
+ if( device_ui_rotation == 90 ) {
// upside-down portrait
location_y -= first_line_height;
}
@@ -1190,7 +1185,7 @@ public class DrawPreview {
}
int height = applicationInterface.drawTextWithBackground(canvas, p, free_memory_gb_string, Color.WHITE, Color.BLACK, location_x, location_y, MyApplicationInterface.Alignment.ALIGNMENT_TOP, null, MyApplicationInterface.Shadow.SHADOW_OUTLINE, text_bounds_free_memory);
height += gap_y;
- if( ui_rotation == 90 ) {
+ if( device_ui_rotation == 90 ) {
location_y -= height;
}
else {
@@ -1274,7 +1269,7 @@ public class DrawPreview {
height += gap_y;
// only move location_y if we actually print something (because on old camera API, even if the ISO option has
// been enabled, we'll never be able to display the on-screen ISO)
- if( ui_rotation == 90 ) {
+ if( device_ui_rotation == 90 ) {
location_y -= height;
}
else {
@@ -1291,7 +1286,7 @@ public class DrawPreview {
int location_x2 = location_x - flash_padding;
final int icon_size = (int) (16 * scale + 0.5f); // convert dps to pixels
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 = location_x - icon_size + flash_padding;
}
@@ -1318,7 +1313,7 @@ public class DrawPreview {
canvas.drawBitmap(location_off_bitmap, null, icon_dest, p);
}
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 -= icon_size + flash_padding;
}
else {
@@ -1340,7 +1335,7 @@ public class DrawPreview {
p.setAlpha(255);
canvas.drawBitmap(is_raw_only_pref ? raw_only_bitmap : raw_jpeg_bitmap, null, icon_dest, p);
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 -= icon_size + flash_padding;
}
else {
@@ -1357,7 +1352,7 @@ public class DrawPreview {
p.setAlpha(255);
face_detection_drawable.draw(canvas);
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 -= icon_size + flash_padding;
}
else {
@@ -1374,7 +1369,7 @@ public class DrawPreview {
p.setAlpha(255);
auto_stabilise_drawable.draw(canvas);
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 -= icon_size + flash_padding;
}
else {
@@ -1387,9 +1382,12 @@ public class DrawPreview {
photoMode == MyApplicationInterface.PhotoMode.HDR ||
photoMode == MyApplicationInterface.PhotoMode.Panorama ||
photoMode == MyApplicationInterface.PhotoMode.ExpoBracketing ||
- photoMode == MyApplicationInterface.PhotoMode.FocusBracketing ||
+ //photoMode == MyApplicationInterface.PhotoMode.FocusBracketing ||
photoMode == MyApplicationInterface.PhotoMode.FastBurst ||
- photoMode == MyApplicationInterface.PhotoMode.NoiseReduction
+ photoMode == MyApplicationInterface.PhotoMode.NoiseReduction ||
+ photoMode == MyApplicationInterface.PhotoMode.X_Night ||
+ photoMode == MyApplicationInterface.PhotoMode.X_Bokeh ||
+ photoMode == MyApplicationInterface.PhotoMode.X_Beauty
) &&
!applicationInterface.isVideoPref() ) { // these photo modes not supported for video mode
icon_dest.set(location_x2, location_y, location_x2 + icon_size, location_y + icon_size);
@@ -1402,9 +1400,12 @@ public class DrawPreview {
photoMode == MyApplicationInterface.PhotoMode.HDR ? hdr_bitmap :
photoMode == MyApplicationInterface.PhotoMode.Panorama ? panorama_bitmap :
photoMode == MyApplicationInterface.PhotoMode.ExpoBracketing ? expo_bitmap :
- photoMode == MyApplicationInterface.PhotoMode.FocusBracketing ? focus_bracket_bitmap :
+ //photoMode == MyApplicationInterface.PhotoMode.FocusBracketing ? focus_bracket_bitmap :
photoMode == MyApplicationInterface.PhotoMode.FastBurst ? burst_bitmap :
photoMode == MyApplicationInterface.PhotoMode.NoiseReduction ? nr_bitmap :
+ photoMode == MyApplicationInterface.PhotoMode.X_Night ? x_night_bitmap :
+ photoMode == MyApplicationInterface.PhotoMode.X_Bokeh ? x_bokeh_bitmap :
+ photoMode == MyApplicationInterface.PhotoMode.X_Beauty ? x_beauty_bitmap :
null;
if( bitmap != null ) {
if( photoMode == MyApplicationInterface.PhotoMode.NoiseReduction && applicationInterface.getNRModePref() == ApplicationInterface.NRModePref.NRMODE_LOW_LIGHT ) {
@@ -1413,7 +1414,7 @@ public class DrawPreview {
canvas.drawBitmap(bitmap, null, icon_dest, p);
p.setColorFilter(null);
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 -= icon_size + flash_padding;
}
else {
@@ -1422,6 +1423,7 @@ public class DrawPreview {
}
}
+
// photo-stamp is supported for photos taken in video mode
// but it isn't supported in RAW-only mode
if( has_stamp_pref && !( is_raw_only_pref && preview.supportsRaw() ) ) {
@@ -1433,7 +1435,7 @@ public class DrawPreview {
p.setAlpha(255);
canvas.drawBitmap(photostamp_bitmap, null, icon_dest, p);
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 -= icon_size + flash_padding;
}
else {
@@ -1450,7 +1452,7 @@ public class DrawPreview {
p.setAlpha(255);
canvas.drawBitmap(audio_disabled_bitmap, null, icon_dest, p);
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 -= icon_size + flash_padding;
}
else {
@@ -1468,7 +1470,7 @@ public class DrawPreview {
p.setAlpha(255);
canvas.drawBitmap(capture_rate_factor < 1.0f ? slow_motion_bitmap : time_lapse_bitmap, null, icon_dest, p);
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 -= icon_size + flash_padding;
}
else {
@@ -1484,7 +1486,7 @@ public class DrawPreview {
p.setAlpha(255);
canvas.drawBitmap(high_speed_fps_bitmap, null, icon_dest, p);
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 -= icon_size + flash_padding;
}
else {
@@ -1532,7 +1534,7 @@ public class DrawPreview {
needs_flash_time = -1;
}
- if( ui_rotation == 90 ) {
+ if( device_ui_rotation == 90 ) {
location_y -= icon_gap_y;
}
else {
@@ -1552,11 +1554,11 @@ public class DrawPreview {
// n.b., if changing the histogram_height, remember to update focus_seekbar and
// focus_bracketing_target_seekbar margins in activity_main.xml
int location_x2 = location_x - flash_padding;
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
location_x2 = location_x - histogram_width + flash_padding;
}
icon_dest.set(location_x2 - flash_padding, location_y, location_x2 - flash_padding + histogram_width, location_y + histogram_height);
- if( ui_rotation == 90 ) {
+ if( device_ui_rotation == 90 ) {
icon_dest.top -= histogram_height;
icon_dest.bottom -= histogram_height;
}
@@ -1672,7 +1674,7 @@ public class DrawPreview {
/** This includes drawing of the UI that requires the canvas to be rotated according to the preview's
* current UI rotation.
*/
- private void drawUI(Canvas canvas, long time_ms) {
+ private void drawUI(Canvas canvas, int device_ui_rotation, long time_ms) {
Preview preview = main_activity.getPreview();
CameraController camera_controller = preview.getCameraController();
int ui_rotation = preview.getUIRotation();
@@ -1681,6 +1683,8 @@ public class DrawPreview {
double level_angle = preview.getLevelAngle();
boolean has_geo_direction = preview.hasGeoDirection();
double geo_direction = preview.getGeoDirection();
+ MainActivity.SystemOrientation system_orientation = main_activity.getSystemOrientation();
+ boolean system_orientation_portrait = system_orientation == MainActivity.SystemOrientation.PORTRAIT;
int text_base_y = 0;
canvas.save();
@@ -1689,23 +1693,24 @@ public class DrawPreview {
if( camera_controller != null && !preview.isPreviewPaused() ) {
/*canvas.drawText("PREVIEW", canvas.getWidth() / 2,
canvas.getHeight() / 2, p);*/
+
int gap_y = (int) (20 * scale + 0.5f); // convert dps to pixels
int text_y = (int) (16 * scale + 0.5f); // convert dps to pixels
boolean avoid_ui = false;
// fine tuning to adjust placement of text with respect to the GUI, depending on orientation
- if( ui_placement == MainUI.UIPlacement.UIPLACEMENT_TOP && ( ui_rotation == 0 || ui_rotation == 180 ) ) {
+ if( ui_placement == MainUI.UIPlacement.UIPLACEMENT_TOP && ( device_ui_rotation == 0 || device_ui_rotation == 180 ) ) {
text_base_y = canvas.getHeight() - (int)(0.1*gap_y);
avoid_ui = true;
}
- else if( ui_rotation == ( ui_placement == MainUI.UIPlacement.UIPLACEMENT_RIGHT ? 0 : 180 ) ) {
+ else if( device_ui_rotation == ( ui_placement == MainUI.UIPlacement.UIPLACEMENT_RIGHT ? 0 : 180 ) ) {
text_base_y = canvas.getHeight() - (int)(0.1*gap_y);
avoid_ui = true;
}
- else if( ui_rotation == ( ui_placement == MainUI.UIPlacement.UIPLACEMENT_RIGHT ? 180 : 0 ) ) {
+ else if( device_ui_rotation == ( ui_placement == MainUI.UIPlacement.UIPLACEMENT_RIGHT ? 180 : 0 ) ) {
text_base_y = canvas.getHeight() - (int)(2.5*gap_y); // leave room for GUI icons
}
- else if( ui_rotation == 90 || ui_rotation == 270 ) {
- // ui_rotation 90 is upside down portrait
+ else if( device_ui_rotation == 90 || device_ui_rotation == 270 ) {
+ // 90 is upside down portrait
// 270 is portrait
if( last_take_photo_top_time == 0 || time_ms > last_take_photo_top_time + 1000 ) {
@@ -1716,11 +1721,12 @@ public class DrawPreview {
// align with "top" of the take_photo button, but remember to take the rotation into account!
int view_left = getViewOnScreenX(view);
preview.getView().getLocationOnScreen(gui_location);
- int this_left = gui_location[0];
+ int this_left = gui_location[system_orientation_portrait ? 1 : 0];
take_photo_top = view_left - this_left;
last_take_photo_top_time = time_ms;
/*if( MyDebug.LOG ) {
+ Log.d(TAG, "device_ui_rotation: " + device_ui_rotation);
Log.d(TAG, "view_left: " + view_left);
Log.d(TAG, "this_left: " + this_left);
Log.d(TAG, "take_photo_top: " + take_photo_top);
@@ -1728,7 +1734,9 @@ public class DrawPreview {
}
// diff_x is the difference from the centre of the canvas to the position we want
- int diff_x = take_photo_top - canvas.getWidth()/2;
+ int max_x = system_orientation_portrait ? canvas.getHeight() : canvas.getWidth();
+ int mid_x = max_x/2;
+ int diff_x = take_photo_top - mid_x;
/*if( MyDebug.LOG ) {
Log.d(TAG, "view left: " + view_left);
@@ -1747,8 +1755,7 @@ public class DrawPreview {
int diff_x = preview.getView().getRootView().getRight()/2 - offset_x;
*/
- int max_x = canvas.getWidth();
- if( ui_rotation == 90 ) {
+ if( device_ui_rotation == 90 ) {
// so we don't interfere with the top bar info (datetime, free memory, ISO) when upside down
max_x -= (int)(2.5*gap_y);
}
@@ -1758,9 +1765,9 @@ public class DrawPreview {
Log.d(TAG, "canvas.getWidth()/2 + diff_x: " + (canvas.getWidth()/2+diff_x));
Log.d(TAG, "max_x: " + max_x);
}*/
- if( canvas.getWidth()/2 + diff_x > max_x ) {
+ if( mid_x + diff_x > max_x ) {
// in case goes off the size of the canvas, for "black bar" cases (when preview aspect ratio < screen aspect ratio)
- diff_x = max_x - canvas.getWidth()/2;
+ diff_x = max_x - mid_x;
}
text_base_y = canvas.getHeight()/2 + diff_x - (int)(0.5*gap_y);
}
@@ -1870,7 +1877,7 @@ public class DrawPreview {
}
}
else if( preview.isVideoRecording() ) {
- long video_time = preview.getVideoTime();
+ long video_time = preview.getVideoTime(false);
String time_s = getTimeStringFromSeconds(video_time/1000);
/*if( MyDebug.LOG )
Log.d(TAG, "video_time: " + video_time + " " + time_s);*/
@@ -1951,7 +1958,7 @@ public class DrawPreview {
p.setTextSize(14 * scale + 0.5f); // convert dps to pixels
p.setTextAlign(Paint.Align.CENTER);
int pixels_offset_y = 2*text_y; // avoid overwriting the zoom
- if( ui_rotation == 0 && applicationInterface.getPhotoMode() == MyApplicationInterface.PhotoMode.FocusBracketing ) {
+ if( device_ui_rotation == 0 && applicationInterface.getPhotoMode() == MyApplicationInterface.PhotoMode.FocusBracketing ) {
// avoid clashing with the target focus bracketing seekbar in landscape orientation
pixels_offset_y = 5*gap_y;
}
@@ -1988,8 +1995,8 @@ public class DrawPreview {
if( preview.supportsZoom() && show_zoom_pref ) {
float zoom_ratio = preview.getZoomRatio();
- // only show when actually zoomed in
- if( zoom_ratio > 1.0f + 1.0e-5f ) {
+ // only show when actually zoomed in - or out!
+ if( zoom_ratio < 1.0f - 1.0e-5f || zoom_ratio > 1.0f + 1.0e-5f ) {
// Convert the dps to pixels, based on density scale
p.setTextSize(14 * scale + 0.5f); // convert dps to pixels
p.setTextAlign(Paint.Align.CENTER);
@@ -2032,22 +2039,31 @@ public class DrawPreview {
// avoid computing every time, due to cost of calling View.getLocationOnScreen()
/*if( MyDebug.LOG )
Log.d(TAG, "update cached top_icon_shift");*/
- int top_margin = getViewOnScreenX(top_icon) + top_icon.getWidth();
+ int top_margin = getViewOnScreenX(top_icon);
+ if( system_orientation == MainActivity.SystemOrientation.LANDSCAPE )
+ top_margin += top_icon.getWidth();
+ else if( system_orientation == MainActivity.SystemOrientation.PORTRAIT )
+ top_margin += top_icon.getHeight();
+ // n.b., don't adjust top_margin for icon width/height for a reverse orientation
preview.getView().getLocationOnScreen(gui_location);
- int preview_left = gui_location[0];
+ int preview_left = gui_location[system_orientation_portrait ? 1 : 0];
+ if( system_orientation == MainActivity.SystemOrientation.REVERSE_LANDSCAPE )
+ preview_left += preview.getView().getWidth(); // actually want preview-right for reverse landscape
this.top_icon_shift = top_margin - preview_left;
- if( MyDebug.LOG ) {
+ if( system_orientation == MainActivity.SystemOrientation.REVERSE_LANDSCAPE )
+ this.top_icon_shift = -this.top_icon_shift;
+ /*if( MyDebug.LOG ) {
Log.d(TAG, "top_icon.getRotation(): " + top_icon.getRotation());
Log.d(TAG, "preview_left: " + preview_left);
Log.d(TAG, "top_margin: " + top_margin);
Log.d(TAG, "top_icon_shift: " + top_icon_shift);
- }
+ }*/
last_top_icon_shift_time = time_ms;
}
if( this.top_icon_shift > 0 ) {
- if( ui_rotation == 90 || ui_rotation == 270 ) {
+ if( device_ui_rotation == 90 || device_ui_rotation == 270 ) {
// portrait
top_y += top_icon_shift;
}
@@ -2077,15 +2093,51 @@ public class DrawPreview {
if( MyDebug.LOG )
Log.d(TAG, "set focus_seekbars_margin_left to " + focus_seekbars_margin_left);
+ // "left" and "right" here are written assuming we're in landscape system orientation
+
View view = main_activity.findViewById(R.id.focus_seekbar);
RelativeLayout.LayoutParams layoutParams = (RelativeLayout.LayoutParams)view.getLayoutParams();
- layoutParams.setMargins(focus_seekbars_margin_left, 0, 0, 0);
+ preview.getView().getLocationOnScreen(gui_location);
+ int preview_left = gui_location[system_orientation_portrait ? 1 : 0];
+ if( system_orientation == MainActivity.SystemOrientation.REVERSE_LANDSCAPE )
+ preview_left += preview.getView().getWidth(); // actually want preview-right for reverse landscape
+
+ view.getLocationOnScreen(gui_location);
+ int seekbar_right = gui_location[system_orientation_portrait ? 1 : 0];
+ if( system_orientation == MainActivity.SystemOrientation.LANDSCAPE || system_orientation == MainActivity.SystemOrientation.PORTRAIT ) {
+ // n.b., we read view.getWidth() even if system_orientation is portrait, because the seekbar is rotated in portrait orientation
+ seekbar_right += view.getWidth();
+ }
+ else {
+ // and for reversed landscape, the seekbar is rotated 180 degrees, and getLocationOnScreen() returns the location after the rotation
+ seekbar_right -= view.getWidth();
+ }
+
+ int min_seekbar_width = (int) (150 * scale + 0.5f); // convert dps to pixels
+ int new_seekbar_width;
+ if( system_orientation == MainActivity.SystemOrientation.LANDSCAPE || system_orientation == MainActivity.SystemOrientation.PORTRAIT ) {
+ new_seekbar_width = seekbar_right - (preview_left+focus_seekbars_margin_left);
+ }
+ else {
+ // reversed landscape
+ new_seekbar_width = preview_left - focus_seekbars_margin_left - seekbar_right;
+ }
+ new_seekbar_width = Math.max(new_seekbar_width, min_seekbar_width);
+ /*if( MyDebug.LOG ) {
+ Log.d(TAG, "preview_left: " + preview_left);
+ Log.d(TAG, "seekbar_right: " + seekbar_right);
+ Log.d(TAG, "new_seekbar_width: " + new_seekbar_width);
+ }*/
+ layoutParams.width = new_seekbar_width;
view.setLayoutParams(layoutParams);
view = main_activity.findViewById(R.id.focus_bracketing_target_seekbar);
layoutParams = (RelativeLayout.LayoutParams)view.getLayoutParams();
- layoutParams.setMargins(focus_seekbars_margin_left, 0, 0, 0);
+ layoutParams.width = new_seekbar_width;
view.setLayoutParams(layoutParams);
+
+ // need to update due to changing width of focus seekbars
+ main_activity.getMainUI().setFocusSeekbarsRotation();
}
}
@@ -2094,14 +2146,15 @@ public class DrawPreview {
int battery_width = (int) (5 * scale + 0.5f); // convert dps to pixels
int battery_height = 4*battery_width;
if( ui_rotation == 90 || ui_rotation == 270 ) {
+ // n.b., this is only for when lock_to_landscape==true, so we don't look at device_ui_rotation
int diff = canvas.getWidth() - canvas.getHeight();
battery_x += diff/2;
battery_y -= diff/2;
}
- if( ui_rotation == 90 ) {
+ if( device_ui_rotation == 90 ) {
battery_y = canvas.getHeight() - battery_y - battery_height;
}
- if( ui_rotation == 180 ) {
+ if( device_ui_rotation == 180 ) {
battery_x = canvas.getWidth() - battery_x - battery_width;
}
if( show_battery_pref ) {
@@ -2137,7 +2190,7 @@ public class DrawPreview {
top_x += (int) (10 * scale + 0.5f); // convert dps to pixels
}
- onDrawInfoLines(canvas, top_x, top_y, text_base_y, time_ms);
+ onDrawInfoLines(canvas, top_x, top_y, text_base_y, device_ui_rotation, time_ms);
canvas.restore();
}
@@ -2147,9 +2200,11 @@ public class DrawPreview {
return "· " + zoom_ratio + "X" + " ·";
}
- private void drawAngleLines(Canvas canvas, long time_ms) {
+ private void drawAngleLines(Canvas canvas, int device_ui_rotation, long time_ms) {
Preview preview = main_activity.getPreview();
CameraController camera_controller = preview.getCameraController();
+ MainActivity.SystemOrientation system_orientation = main_activity.getSystemOrientation();
+ boolean system_orientation_portrait = system_orientation == MainActivity.SystemOrientation.PORTRAIT;
boolean has_level_angle = preview.hasLevelAngle();
boolean actual_show_angle_line_pref;
if( photoMode == MyApplicationInterface.PhotoMode.Panorama ) {
@@ -2162,28 +2217,36 @@ public class DrawPreview {
boolean allow_angle_lines = camera_controller != null && !preview.isPreviewPaused();
if( allow_angle_lines && has_level_angle && ( actual_show_angle_line_pref || show_pitch_lines_pref || show_geo_direction_lines_pref ) ) {
- int ui_rotation = preview.getUIRotation();
double level_angle = preview.getLevelAngle();
boolean has_pitch_angle = preview.hasPitchAngle();
double pitch_angle = preview.getPitchAngle();
boolean has_geo_direction = preview.hasGeoDirection();
double geo_direction = preview.getGeoDirection();
// n.b., must draw this without the standard canvas rotation
- int radius_dps = (ui_rotation == 90 || ui_rotation == 270) ? 60 : 80;
+ // lines should be shorter in portrait
+ int radius_dps = (device_ui_rotation == 90 || device_ui_rotation == 270) ? 60 : 80;
int radius = (int) (radius_dps * scale + 0.5f); // convert dps to pixels
double angle = - preview.getOrigLevelAngle();
// see http://android-developers.blogspot.co.uk/2010/09/one-screen-turn-deserves-another.html
- int rotation = main_activity.getWindowManager().getDefaultDisplay().getRotation();
+ int rotation = main_activity.getDisplayRotation();
switch (rotation) {
case Surface.ROTATION_90:
- case Surface.ROTATION_270:
angle -= 90.0;
break;
- case Surface.ROTATION_0:
+ case Surface.ROTATION_270:
+ angle += 90.0;
+ break;
case Surface.ROTATION_180:
+ angle += 180.0;
+ break;
+ case Surface.ROTATION_0:
default:
break;
}
+ /*if( MyDebug.LOG ) {
+ Log.d(TAG, "system_orientation: " + system_orientation);
+ Log.d(TAG, "rotation: " + rotation);
+ }*/
/*if( MyDebug.LOG ) {
Log.d(TAG, "orig_level_angle: " + preview.getOrigLevelAngle());
Log.d(TAG, "angle: " + angle);
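The rewritten switch above maps each display rotation to an angle correction before drawing the level lines. A sketch of that mapping, runnable off-device (the constant values mirror Android's `Surface.ROTATION_*`, and the method name is mine):

```java
public class RotationOffset {
    // Stand-ins for Surface.ROTATION_0/90/180/270, which are 0..3 on Android.
    public static final int ROTATION_0 = 0, ROTATION_90 = 1, ROTATION_180 = 2, ROTATION_270 = 3;

    // Angle correction applied to the level angle, per the switch in the hunk above:
    // getRotation() is anti-clockwise, so ROTATION_90 subtracts 90 and ROTATION_270 adds 90.
    public static double angleOffset(int rotation) {
        switch (rotation) {
            case ROTATION_90:  return -90.0;
            case ROTATION_270: return  90.0;
            case ROTATION_180: return 180.0;
            default:           return   0.0;
        }
    }
}
```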
@@ -2247,8 +2310,15 @@ public class DrawPreview {
}
}
updateCachedViewAngles(time_ms); // ensure view_angle_x_preview, view_angle_y_preview are computed and up to date
- float camera_angle_x = this.view_angle_x_preview;
- float camera_angle_y = this.view_angle_y_preview;
+ float camera_angle_x, camera_angle_y;
+ if( system_orientation_portrait ) {
+ camera_angle_x = this.view_angle_y_preview;
+ camera_angle_y = this.view_angle_x_preview;
+ }
+ else {
+ camera_angle_x = this.view_angle_x_preview;
+ camera_angle_y = this.view_angle_y_preview;
+ }
float angle_scale_x = (float)( canvas.getWidth() / (2.0 * Math.tan( Math.toRadians((camera_angle_x/2.0)) )) );
float angle_scale_y = (float)( canvas.getHeight() / (2.0 * Math.tan( Math.toRadians((camera_angle_y/2.0)) )) );
/*if( MyDebug.LOG ) {
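The `angle_scale_x`/`angle_scale_y` expressions above follow a pinhole-camera model: half the canvas spans `tan(fov/2)`, so the scale is `size / (2 * tan(fov/2))`. A minimal sketch of that formula (class and method names are mine):

```java
public class AngleScale {
    // Pixels-per-unit-tangent scale used to place pitch/geo lines on the canvas:
    // for a pinhole model, scale = canvasSize / (2 * tan(viewAngle / 2)).
    public static double angleScale(int canvasSize, double viewAngleDegrees) {
        return canvasSize / (2.0 * Math.tan(Math.toRadians(viewAngleDegrees / 2.0)));
    }
}
```

For a 90-degree field of view, `tan(45°) = 1`, so the scale is simply half the canvas size.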
@@ -2266,7 +2336,8 @@ public class DrawPreview {
float angle_scale = (float)Math.sqrt( angle_scale_x*angle_scale_x + angle_scale_y*angle_scale_y );
angle_scale *= preview.getZoomRatio();
if( has_pitch_angle && show_pitch_lines_pref ) {
- int pitch_radius_dps = (ui_rotation == 90 || ui_rotation == 270) ? 100 : 80;
+ // lines should be shorter in portrait
+ int pitch_radius_dps = (device_ui_rotation == 90 || device_ui_rotation == 270) ? 80 : 100;
int pitch_radius = (int) (pitch_radius_dps * scale + 0.5f); // convert dps to pixels
int angle_step = 10;
if( preview.getZoomRatio() >= 2.0f )
@@ -2308,7 +2379,9 @@ public class DrawPreview {
}
}
if( has_geo_direction && has_pitch_angle && show_geo_direction_lines_pref ) {
- int geo_radius_dps = (ui_rotation == 90 || ui_rotation == 270) ? 80 : 100;
+ // lines should be longer in portrait - n.b., this is opposite to behaviour of pitch lines, as
+ // geo lines are drawn perpendicularly
+ int geo_radius_dps = (device_ui_rotation == 90 || device_ui_rotation == 270) ? 100 : 80;
int geo_radius = (int) (geo_radius_dps * scale + 0.5f); // convert dps to pixels
float geo_angle = (float)Math.toDegrees(geo_direction);
int angle_step = 10;
@@ -2552,6 +2625,12 @@ public class DrawPreview {
}
}
+ public void setCoverPreview(boolean cover_preview) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "setCoverPreview: " + cover_preview);
+ this.cover_preview = cover_preview;
+ }
+
public void onDrawPreview(Canvas canvas) {
/*if( MyDebug.LOG )
Log.d(TAG, "onDrawPreview");*/
@@ -2592,10 +2671,26 @@ public class DrawPreview {
preview.disableFocusPeaking();
}
- // see documentation for CameraController.shouldCoverPreview()
- if( preview.usingCamera2API() && ( camera_controller == null || camera_controller.shouldCoverPreview() ) ) {
- p.setColor(Color.BLACK);
- canvas.drawRect(0.0f, 0.0f, canvas.getWidth(), canvas.getHeight(), p);
+ // See documentation for CameraController.shouldCoverPreview().
+ // Note, originally we checked camera_controller.shouldCoverPreview() every frame, but this
+ // has the problem that we blank whenever the camera is being reopened, e.g., when switching
+ // cameras or changing photo modes that require a reopen. The intent however is to only
+ // cover up the camera when the application is pausing, and to keep it covered up until
+ // after we've resumed, and the camera has been reopened and we've received frames.
+ if( preview.usingCamera2API() ) {
+ if( cover_preview ) {
+ // see if we have received a frame yet
+ if( camera_controller != null && !camera_controller.shouldCoverPreview() ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "no longer need to cover preview");
+ cover_preview = false;
+ }
+ }
+ if( cover_preview ) {
+ p.setColor(Color.BLACK);
+ //p.setColor(Color.RED); // test
+ canvas.drawRect(0.0f, 0.0f, canvas.getWidth(), canvas.getHeight(), p);
+ }
}
if( camera_controller!= null && front_screen_flash ) {
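The hunk above replaces a per-frame `shouldCoverPreview()` check with a latched flag: the cover is set on pause and only cleared once a frame actually arrives after resume, so reopening the camera mid-session no longer blanks the preview. A sketch of that latch pattern in isolation (class and method names are mine):

```java
public class CoverLatch {
    private boolean coverPreview;

    // Set when the application pauses (analogous to setCoverPreview() above).
    public void setCoverPreview(boolean cover) {
        this.coverPreview = cover;
    }

    // Called per frame: once a frame has been received (controllerSaysCover==false),
    // clear the latch. It stays clear even if the controller later reports true
    // (e.g., while reopening the camera), until setCoverPreview(true) is called again.
    public boolean shouldDrawCover(boolean controllerSaysCover) {
        if (coverPreview && !controllerSaysCover)
            coverPreview = false;
        return coverPreview;
    }
}
```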
@@ -2609,6 +2704,19 @@ public class DrawPreview {
p.setAlpha(255);
}
+ // If MainActivity.lock_to_landscape==true, then the ui_rotation represents the orientation of the
+ // device; if MainActivity.lock_to_landscape==false then ui_rotation is always 0 as we don't need to
+ // apply any orientation ourselves. However, we do still want to know the true rotation of the
+ // device, as it affects how certain elements of the UI are laid out.
+ int device_ui_rotation;
+ if( MainActivity.lock_to_landscape ) {
+ device_ui_rotation = ui_rotation;
+ }
+ else {
+ MainActivity.SystemOrientation system_orientation = main_activity.getSystemOrientation();
+ device_ui_rotation = MainActivity.getRotationFromSystemOrientation(system_orientation);
+ }
+
if( camera_controller != null && taking_picture && !front_screen_flash && take_photo_border_pref ) {
p.setColor(Color.WHITE);
p.setStyle(Paint.Style.STROKE);
@@ -2677,9 +2785,9 @@ public class DrawPreview {
doThumbnailAnimation(canvas, time_ms);
- drawUI(canvas, time_ms);
+ drawUI(canvas, device_ui_rotation, time_ms);
- drawAngleLines(canvas, time_ms);
+ drawAngleLines(canvas, device_ui_rotation, time_ms);
doFocusAnimation(canvas, time_ms);
@@ -2691,7 +2799,7 @@ public class DrawPreview {
for(CameraController.Face face : faces_detected) {
// Android doc recommends filtering out faces with score less than 50 (same for both Camera and Camera2 APIs)
if( face.score >= 50 ) {
- canvas.drawRect(face.rect, p);
+ canvas.drawRect(face.temp, p);
}
}
p.setStyle(Paint.Style.FILL); // reset
@@ -2700,17 +2808,33 @@ public class DrawPreview {
if( enable_gyro_target_spot && camera_controller != null ) {
GyroSensor gyroSensor = main_activity.getApplicationInterface().getGyroSensor();
if( gyroSensor.isRecording() ) {
+ MainActivity.SystemOrientation system_orientation = main_activity.getSystemOrientation();
+ boolean system_orientation_portrait = system_orientation == MainActivity.SystemOrientation.PORTRAIT;
for(float [] gyro_direction : gyro_directions) {
gyroSensor.getRelativeInverseVector(transformed_gyro_direction, gyro_direction);
gyroSensor.getRelativeInverseVector(transformed_gyro_direction_up, gyro_direction_up);
// note that although X of gyro_direction represents left to right on the device, because we're in landscape mode,
// this is y coordinates on the screen
- float angle_x = - (float)Math.asin(transformed_gyro_direction[1]);
- float angle_y = - (float)Math.asin(transformed_gyro_direction[0]);
+ float angle_x, angle_y;
+ if( system_orientation_portrait ) {
+ angle_x = (float)Math.asin(transformed_gyro_direction[0]);
+ angle_y = - (float)Math.asin(transformed_gyro_direction[1]);
+ }
+ else {
+ angle_x = - (float)Math.asin(transformed_gyro_direction[1]);
+ angle_y = - (float)Math.asin(transformed_gyro_direction[0]);
+ }
if( Math.abs(angle_x) < 0.5f*Math.PI && Math.abs(angle_y) < 0.5f*Math.PI ) {
updateCachedViewAngles(time_ms); // ensure view_angle_x_preview, view_angle_y_preview are computed and up to date
- float camera_angle_x = this.view_angle_x_preview;
- float camera_angle_y = this.view_angle_y_preview;
+ float camera_angle_x, camera_angle_y;
+ if( system_orientation_portrait ) {
+ camera_angle_x = this.view_angle_y_preview;
+ camera_angle_y = this.view_angle_x_preview;
+ }
+ else {
+ camera_angle_x = this.view_angle_x_preview;
+ camera_angle_y = this.view_angle_y_preview;
+ }
float angle_scale_x = (float) (canvas.getWidth() / (2.0 * Math.tan(Math.toRadians((camera_angle_x / 2.0)))));
float angle_scale_y = (float) (canvas.getHeight() / (2.0 * Math.tan(Math.toRadians((camera_angle_y / 2.0)))));
angle_scale_x *= preview.getZoomRatio();
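The gyro-target hunk above swaps which component of the transformed gyro vector maps to screen x versus y depending on orientation. A sketch of that swap (helper name is mine; `vx`/`vy` stand for `transformed_gyro_direction[0]`/`[1]`):

```java
public class GyroAngles {
    // Returns {angle_x, angle_y} for the on-screen target spot. The gyro vector's
    // x component corresponds to screen y in landscape but screen x in portrait,
    // mirroring the portrait/landscape branch in the patch above.
    public static float[] screenAngles(float vx, float vy, boolean portrait) {
        if (portrait)
            return new float[]{ (float) Math.asin(vx), (float) -Math.asin(vy) };
        return new float[]{ (float) -Math.asin(vy), (float) -Math.asin(vx) };
    }
}
```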
@@ -2794,7 +2918,7 @@ public class DrawPreview {
}
}
- private void drawGyroSpot(Canvas canvas, float distance_x, float distance_y, @SuppressWarnings("unused") float dir_x, @SuppressWarnings("unused") float dir_y, int radius_dp, boolean outline) {
+ private void drawGyroSpot(Canvas canvas, float distance_x, float distance_y, float dir_x, float dir_y, int radius_dp, boolean outline) {
if( outline ) {
p.setStyle(Paint.Style.STROKE);
p.setStrokeWidth(stroke_width);
diff --git a/app/src/main/java/net/sourceforge/opencamera/ui/FolderChooserDialog.java b/app/src/main/java/net/sourceforge/opencamera/ui/FolderChooserDialog.java
index 6d97475f6f02e22dd7a272e22d32718896082435..299d30943d7e2ed847fd99592f0e8e3e94883365 100644
--- a/app/src/main/java/net/sourceforge/opencamera/ui/FolderChooserDialog.java
+++ b/app/src/main/java/net/sourceforge/opencamera/ui/FolderChooserDialog.java
@@ -21,6 +21,7 @@ import android.text.InputFilter;
import android.text.Spanned;
import android.util.Log;
import android.util.TypedValue;
+import android.view.LayoutInflater;
import android.view.View;
import android.widget.AdapterView;
import android.widget.AdapterView.OnItemClickListener;
@@ -43,6 +44,7 @@ public class FolderChooserDialog extends DialogFragment {
private File start_folder = new File("");
private File current_folder;
+ private File max_parent; // if non-null, don't show the Parent option if viewing this folder (so the user can't go above that folder)
private AlertDialog folder_dialog;
private ListView list;
private String chosen_folder;
@@ -59,6 +61,7 @@ public class FolderChooserDialog extends DialogFragment {
this.sort_order = sort_order;
}
+ @NonNull
@Override
public String toString() {
if( override_name != null )
@@ -184,12 +187,17 @@ public class FolderChooserDialog extends DialogFragment {
// see testFolderChooserInvalid()
if( MyDebug.LOG )
Log.d(TAG, "failed to read folder");
- // note that we reset to DCIM rather than DCIM/OpenCamera, just to increase likelihood of getting back to a valid state
- refreshList(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM));
- if( current_folder == null ) {
+
+ if( show_dcim_shortcut ) {
if( MyDebug.LOG )
- Log.d(TAG, "can't even read DCIM?!");
- refreshList(new File("/"));
+ Log.d(TAG, "fall back to DCIM");
+ // note that we reset to DCIM rather than DCIM/OpenCamera, just to increase likelihood of getting back to a valid state
+ refreshList(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM));
+ if( current_folder == null ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "can't even read DCIM?!");
+ refreshList(new File("/"));
+ }
}
}
return folder_dialog;
@@ -199,6 +207,12 @@ public class FolderChooserDialog extends DialogFragment {
this.start_folder = start_folder;
}
+ public void setMaxParent(File max_parent) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "setMaxParent: " + max_parent);
+ this.max_parent = max_parent;
+ }
+
public void setShowNewFolderButton(boolean show_new_folder_button) {
this.show_new_folder_button = show_new_folder_button;
}
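The `max_parent` additions above hide the "Parent" entry whenever the current folder equals `max_parent`, so the user cannot browse above it. The gating condition can be sketched on its own (class and method names are mine):

```java
import java.io.File;

public class ParentGate {
    // Whether to offer the "Parent" option in the folder list: never for the
    // filesystem root, and not when the current folder is the configured ceiling.
    public static boolean showParentOption(File current, File maxParent) {
        if (current.getParentFile() == null)
            return false; // at the root, there is no parent to show
        return maxParent == null || !maxParent.equals(current);
    }
}
```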
@@ -236,8 +250,14 @@ public class FolderChooserDialog extends DialogFragment {
// n.b., files may be null if no files could be found in the folder (or we can't read) - but should still allow the user
// to view this folder (so the user can go to parent folders which might be readable again)
List listed_files = new ArrayList<>();
- if( new_folder.getParentFile() != null )
- listed_files.add(new FileWrapper(new_folder.getParentFile(), getResources().getString(R.string.parent_folder), 0));
+ if( new_folder.getParentFile() != null ) {
+ if( max_parent != null && max_parent.equals(new_folder) ) {
+ // don't show parent option
+ }
+ else {
+ listed_files.add(new FileWrapper(new_folder.getParentFile(), getResources().getString(R.string.parent_folder), 0));
+ }
+ }
if( show_dcim_shortcut ) {
File default_folder = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM);
if( !default_folder.equals(new_folder) && !default_folder.equals(new_folder.getParentFile()) )
@@ -296,12 +316,14 @@ public class FolderChooserDialog extends DialogFragment {
if( current_folder == null )
return false;
if( canWrite() ) {
- File base_folder = StorageUtils.getBaseFolder();
String new_save_location = current_folder.getAbsolutePath();
- if( current_folder.getParentFile() != null && current_folder.getParentFile().equals(base_folder) ) {
- if( MyDebug.LOG )
- Log.d(TAG, "parent folder is base folder");
- new_save_location = current_folder.getName();
+ if( this.show_dcim_shortcut ) {
+ File base_folder = StorageUtils.getBaseFolder();
+ if( current_folder.getParentFile() != null && current_folder.getParentFile().equals(base_folder) ) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "parent folder is base folder");
+ new_save_location = current_folder.getName();
+ }
}
if( MyDebug.LOG )
Log.d(TAG, "new_save_location: " + new_save_location);
@@ -349,17 +371,21 @@ public class FolderChooserDialog extends DialogFragment {
if( current_folder == null )
return;
if( canWrite() ) {
- final EditText edit_text = new EditText(getActivity());
+ final View dialog_view = LayoutInflater.from(getActivity()).inflate(R.layout.alertdialog_edittext, null);
+ final EditText edit_text = dialog_view.findViewById(R.id.edit_text);
+
edit_text.setSingleLine();
edit_text.setTextSize(TypedValue.COMPLEX_UNIT_DIP, 20.0f);
- edit_text.setContentDescription(getResources().getString(R.string.enter_new_folder));
+ // set hint instead of content description for EditText, see https://support.google.com/accessibility/android/answer/6378120
+ //edit_text.setContentDescription(getResources().getString(R.string.enter_new_folder));
+ edit_text.setHint(getResources().getString(R.string.enter_new_folder));
InputFilter filter = new NewFolderInputFilter();
edit_text.setFilters(new InputFilter[]{filter});
Dialog dialog = new AlertDialog.Builder(getActivity())
//.setIcon(R.drawable.alert_dialog_icon)
.setTitle(R.string.enter_new_folder)
- .setView(edit_text)
+ .setView(dialog_view)
.setPositiveButton(android.R.string.ok, new Dialog.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
diff --git a/app/src/main/java/net/sourceforge/opencamera/ui/MainUI.java b/app/src/main/java/net/sourceforge/opencamera/ui/MainUI.java
index 043fcade5a7065ebac4b9f10cf1963a75eac0a22..bbe2fa97d6c619bf5d797a517cb53009026db2e0 100644
--- a/app/src/main/java/net/sourceforge/opencamera/ui/MainUI.java
+++ b/app/src/main/java/net/sourceforge/opencamera/ui/MainUI.java
@@ -31,6 +31,8 @@ import android.widget.LinearLayout;
import android.widget.RelativeLayout;
import android.widget.SeekBar;
+import androidx.constraintlayout.widget.ConstraintLayout;
+
import net.sourceforge.opencamera.MainActivity;
import net.sourceforge.opencamera.MyApplicationInterface;
import net.sourceforge.opencamera.MyDebug;
@@ -71,7 +73,9 @@ public class MainUI {
private UIPlacement ui_placement = UIPlacement.UIPLACEMENT_RIGHT;
private View top_icon = null;
private boolean view_rotate_animation;
+ private float view_rotate_animation_start; // for MainActivity.lock_to_landscape==false
private final static int view_rotate_animation_duration = 100; // duration in ms of the icon rotation animation
+ public final static int privacy_indicator_gap_dp = 24;
private boolean immersive_mode;
private boolean show_gui_photo = true; // result of call to showGUI() - false means a "reduced" GUI is displayed, whilst taking photo or video
@@ -153,6 +157,12 @@ public class MainUI {
if (!view_rotate_animation) {
view.setRotation(ui_rotation);
}
+ if( !MainActivity.lock_to_landscape ) {
+ float start_rotation = view_rotate_animation_start + ui_rotation;
+ if( start_rotation >= 360.0f )
+ start_rotation -= 360.0f;
+ view.setRotation(start_rotation);
+ }
float rotate_by = ui_rotation - view.getRotation();
if (rotate_by > 181.0f)
rotate_by -= 360.0f;
@@ -160,13 +170,31 @@ public class MainUI {
rotate_by += 360.0f;
// view.animate() modifies the view's rotation attribute, so it ends up equivalent to view.setRotation()
// we use rotationBy() instead of rotation(), so we get the minimal rotation for clockwise vs anti-clockwise
- view.animate().rotationBy(rotate_by).setDuration(view_rotate_animation_duration).setInterpolator(new AccelerateDecelerateInterpolator()).start();
+ if( main_activity.is_test && Build.VERSION.SDK_INT <= Build.VERSION_CODES.JELLY_BEAN_MR2 ) {
+ // We randomly get a java.lang.ArrayIndexOutOfBoundsException crash when running MainTests suite
+ // on Android emulator with Android 4.3, from deep below ViewPropertyAnimator.start().
+ // Unclear why this is - I haven't seen this on real devices and can't find out info about it.
+ view.setRotation(ui_rotation);
+ }
+ else {
+ view.animate().rotationBy(rotate_by).setDuration(view_rotate_animation_duration).setInterpolator(new AccelerateDecelerateInterpolator()).start();
+ }
}
public void layoutUI() {
layoutUI(false);
}
+ public void layoutUIWithRotation(float view_rotate_animation_start) {
+ if( MyDebug.LOG )
+ Log.d(TAG, "layoutUIWithRotation: " + view_rotate_animation_start);
+ this.view_rotate_animation = true;
+ this.view_rotate_animation_start = view_rotate_animation_start;
+ layoutUI();
+ view_rotate_animation = false;
+ this.view_rotate_animation_start = 0.0f;
+ }
+
private UIPlacement computeUIPlacement() {
return UIPlacement.UIPLACEMENT_TOP;
}
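The icon-rotation animation earlier in this file normalises `rotate_by` so the view always takes the shorter path, clockwise or anti-clockwise. A sketch of that normalisation, assuming the elided `< -181` branch mirrors the visible `> 181` one (class and method names are mine):

```java
public class RotateBy {
    // Minimal signed rotation from current to target, kept within roughly
    // (-180, 180] so view.animate().rotationBy() spins the short way round.
    public static float minimalRotation(float target, float current) {
        float rotateBy = target - current;
        if (rotateBy > 181.0f)
            rotateBy -= 360.0f;
        else if (rotateBy < -181.0f)
            rotateBy += 360.0f;
        return rotateBy;
    }
}
```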
@@ -178,82 +206,136 @@ public class MainUI {
debug_time = System.currentTimeMillis();
}
+ MainActivity.SystemOrientation system_orientation = main_activity.getSystemOrientation();
+ boolean system_orientation_portrait = system_orientation == MainActivity.SystemOrientation.PORTRAIT;
+ boolean system_orientation_reversed_landscape = system_orientation == MainActivity.SystemOrientation.REVERSE_LANDSCAPE;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, " system_orientation = " + system_orientation);
+ Log.d(TAG, " system_orientation_portrait? " + system_orientation_portrait);
+ }
+
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(main_activity);
// we cache the preference_ui_placement to save having to check it in the draw() method
this.ui_placement = computeUIPlacement();
if (MyDebug.LOG)
Log.d(TAG, "ui_placement: " + ui_placement);
- // new code for orientation fixed to landscape
- // the display orientation should be locked to landscape, but how many degrees is that?
- int rotation = main_activity.getWindowManager().getDefaultDisplay().getRotation();
- int degrees = 0;
- switch (rotation) {
- case Surface.ROTATION_0:
- degrees = 0;
- break;
- case Surface.ROTATION_90:
- degrees = 90;
- break;
- case Surface.ROTATION_180:
- degrees = 180;
- break;
- case Surface.ROTATION_270:
- degrees = 270;
- break;
- default:
- break;
+ int relative_orientation;
+ if( MainActivity.lock_to_landscape ) {
+ // new code for orientation fixed to landscape
+ // the display orientation should be locked to landscape, but how many degrees is that?
+ int rotation = main_activity.getWindowManager().getDefaultDisplay().getRotation();
+ int degrees = 0;
+ switch (rotation) {
+ case Surface.ROTATION_0: degrees = 0; break;
+ case Surface.ROTATION_90: degrees = 90; break;
+ case Surface.ROTATION_180: degrees = 180; break;
+ case Surface.ROTATION_270: degrees = 270; break;
+ default:
+ break;
+ }
+ // getRotation is anti-clockwise, but current_orientation is clockwise, so we add rather than subtract
+ // relative_orientation is clockwise from landscape-left
+ //int relative_orientation = (current_orientation + 360 - degrees) % 360;
+ relative_orientation = (current_orientation + degrees) % 360;
+ if( MyDebug.LOG ) {
+ Log.d(TAG, " current_orientation = " + current_orientation);
+ Log.d(TAG, " degrees = " + degrees);
+ Log.d(TAG, " relative_orientation = " + relative_orientation);
+ }
}
- // getRotation is anti-clockwise, but current_orientation is clockwise, so we add rather than subtract
- // relative_orientation is clockwise from landscape-left
- //int relative_orientation = (current_orientation + 360 - degrees) % 360;
- int relative_orientation = (current_orientation + degrees) % 360;
- if (MyDebug.LOG) {
- Log.d(TAG, " current_orientation = " + current_orientation);
- Log.d(TAG, " degrees = " + degrees);
- Log.d(TAG, " relative_orientation = " + relative_orientation);
+ else {
+ relative_orientation = 0;
}
final int ui_rotation = (360 - relative_orientation) % 360;
main_activity.getPreview().setUIRotation(ui_rotation);
- int align_left = RelativeLayout.ALIGN_LEFT;
- int align_right = RelativeLayout.ALIGN_RIGHT;
- //int align_top = RelativeLayout.ALIGN_TOP;
- //int align_bottom = RelativeLayout.ALIGN_BOTTOM;
- int left_of = RelativeLayout.LEFT_OF;
- int right_of = RelativeLayout.RIGHT_OF;
+ // naming convention for variables is for system_orientation==LANDSCAPE, right-handed UI
+ int align_left = system_orientation_portrait ? RelativeLayout.ALIGN_TOP : RelativeLayout.ALIGN_LEFT;
+ int align_right = system_orientation_portrait ? RelativeLayout.ALIGN_BOTTOM : RelativeLayout.ALIGN_RIGHT;
+ int align_top = system_orientation_portrait ? RelativeLayout.ALIGN_RIGHT : RelativeLayout.ALIGN_TOP;
+ int align_bottom = system_orientation_portrait ? RelativeLayout.ALIGN_LEFT : RelativeLayout.ALIGN_BOTTOM;
+ int left_of = system_orientation_portrait ? RelativeLayout.ABOVE : RelativeLayout.LEFT_OF;
+ int right_of = system_orientation_portrait ? RelativeLayout.BELOW : RelativeLayout.RIGHT_OF;
+ int above = system_orientation_portrait ? RelativeLayout.RIGHT_OF : RelativeLayout.ABOVE;
+ int below = system_orientation_portrait ? RelativeLayout.LEFT_OF : RelativeLayout.BELOW;
+ int ui_independent_left_of = left_of;
+ int ui_independent_right_of = right_of;
+ int ui_independent_above = above;
+ int ui_independent_below = below;
+ int align_parent_left = system_orientation_portrait ? RelativeLayout.ALIGN_PARENT_TOP : RelativeLayout.ALIGN_PARENT_LEFT;
+ int align_parent_right = system_orientation_portrait ? RelativeLayout.ALIGN_PARENT_BOTTOM : RelativeLayout.ALIGN_PARENT_RIGHT;
+ int align_parent_top = system_orientation_portrait ? RelativeLayout.ALIGN_PARENT_RIGHT : RelativeLayout.ALIGN_PARENT_TOP;
+ int align_parent_bottom = system_orientation_portrait ? RelativeLayout.ALIGN_PARENT_LEFT : RelativeLayout.ALIGN_PARENT_BOTTOM;
+ int center_horizontal = system_orientation_portrait ? RelativeLayout.CENTER_VERTICAL : RelativeLayout.CENTER_HORIZONTAL;
+ int center_vertical = system_orientation_portrait ? RelativeLayout.CENTER_HORIZONTAL : RelativeLayout.CENTER_VERTICAL;
+
int iconpanel_left_of = left_of;
int iconpanel_right_of = right_of;
- int above = RelativeLayout.ABOVE;
- int below = RelativeLayout.BELOW;
int iconpanel_above = above;
int iconpanel_below = below;
- int align_parent_left = RelativeLayout.ALIGN_PARENT_LEFT;
- int align_parent_right = RelativeLayout.ALIGN_PARENT_RIGHT;
int iconpanel_align_parent_left = align_parent_left;
int iconpanel_align_parent_right = align_parent_right;
- int align_parent_top = RelativeLayout.ALIGN_PARENT_TOP;
- int align_parent_bottom = RelativeLayout.ALIGN_PARENT_BOTTOM;
int iconpanel_align_parent_top = align_parent_top;
int iconpanel_align_parent_bottom = align_parent_bottom;
- if (ui_placement == UIPlacement.UIPLACEMENT_LEFT) {
- above = RelativeLayout.BELOW;
- below = RelativeLayout.ABOVE;
- align_parent_top = RelativeLayout.ALIGN_PARENT_BOTTOM;
- align_parent_bottom = RelativeLayout.ALIGN_PARENT_TOP;
+
+ if( system_orientation_reversed_landscape ) {
+ int temp = align_left;
+ align_left = align_right;
+ align_right = temp;
+ temp = align_top;
+ align_top = align_bottom;
+ align_bottom = temp;
+ temp = left_of;
+ left_of = right_of;
+ right_of = temp;
+ temp = above;
+ above = below;
+ below = temp;
+
+ ui_independent_left_of = left_of;
+ ui_independent_right_of = right_of;
+ ui_independent_above = above;
+ ui_independent_below = below;
+
+ temp = align_parent_left;
+ align_parent_left = align_parent_right;
+ align_parent_right = temp;
+ temp = align_parent_top;
+ align_parent_top = align_parent_bottom;
+ align_parent_bottom = temp;
+
+ iconpanel_left_of = left_of;
+ iconpanel_right_of = right_of;
+ iconpanel_above = above;
+ iconpanel_below = below;
+ iconpanel_align_parent_left = align_parent_left;
+ iconpanel_align_parent_right = align_parent_right;
iconpanel_align_parent_top = align_parent_top;
iconpanel_align_parent_bottom = align_parent_bottom;
- } else if (ui_placement == UIPlacement.UIPLACEMENT_TOP) {
- iconpanel_left_of = RelativeLayout.BELOW;
- iconpanel_right_of = RelativeLayout.ABOVE;
- iconpanel_above = RelativeLayout.LEFT_OF;
- iconpanel_below = RelativeLayout.RIGHT_OF;
+ }
+
+ if( ui_placement == UIPlacement.UIPLACEMENT_LEFT ) {
+ int temp = above;
+ above = below;
+ below = temp;
+ temp = align_parent_top;
+ align_parent_top = align_parent_bottom;
+ align_parent_bottom = temp;
+ iconpanel_align_parent_top = align_parent_top;
+ iconpanel_align_parent_bottom = align_parent_bottom;
+ }
+ else if( ui_placement == UIPlacement.UIPLACEMENT_TOP ) {
+ iconpanel_left_of = below;
+ iconpanel_right_of = above;
+ iconpanel_above = left_of;
+ iconpanel_below = right_of;
//noinspection SuspiciousNameCombination
- iconpanel_align_parent_left = RelativeLayout.ALIGN_PARENT_BOTTOM;
+ iconpanel_align_parent_left = align_parent_bottom;
//noinspection SuspiciousNameCombination
- iconpanel_align_parent_right = RelativeLayout.ALIGN_PARENT_TOP;
+ iconpanel_align_parent_right = align_parent_top;
//noinspection SuspiciousNameCombination
- iconpanel_align_parent_top = RelativeLayout.ALIGN_PARENT_LEFT;
+ iconpanel_align_parent_top = align_parent_left;
//noinspection SuspiciousNameCombination
- iconpanel_align_parent_bottom = RelativeLayout.ALIGN_PARENT_RIGHT;
+ iconpanel_align_parent_bottom = align_parent_right;
}
Point display_size = new Point();
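The long aliasing block above re-maps the RelativeLayout verbs so the rest of `layoutUI()` can keep using landscape-right-handed names: in portrait, "left" becomes "top" on screen, and reversed landscape swaps each pair. A compact sketch of that mapping with illustrative string stand-ins rather than the real `RelativeLayout.ALIGN_*` constants (names are mine):

```java
public class LayoutAliases {
    static final String L = "left", R = "right", T = "top", B = "bottom";

    // Returns {align_left, align_right, align_top, align_bottom} in the
    // landscape naming convention used by layoutUI().
    public static String[] aliases(boolean portrait, boolean reversedLandscape) {
        // portrait rotates the UI 90 degrees, so "left" maps to on-screen top, etc.
        String[] a = portrait ? new String[]{T, B, R, L} : new String[]{L, R, T, B};
        if (reversedLandscape) {
            // reversed landscape is a 180-degree rotation: swap left/right and top/bottom
            String t = a[0]; a[0] = a[1]; a[1] = t;
            t = a[2]; a[2] = a[3]; a[3] = t;
        }
        return a;
    }
}
```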
@@ -261,6 +343,10 @@ public class MainUI {
display.getSize(display_size);
final int display_height = Math.min(display_size.x, display_size.y);
+ final float scale = main_activity.getResources().getDisplayMetrics().density;
+ if( MyDebug.LOG )
+ Log.d(TAG, "scale: " + scale);
+
/*int navigation_gap = 0;
if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1 ) {
final int display_width = Math.max(display_size.x, display_size.y);
@@ -275,9 +361,34 @@ public class MainUI {
}
}*/
int navigation_gap = main_activity.getNavigationGap();
+ int gallery_navigation_gap = navigation_gap;
+
+ int gallery_top_gap = 0;
+ {
+ // Leave space for the Android 12+ camera privacy indicator, as gallery icon would
+ // otherwise overlap when in landscape orientation.
+ // In theory we should use WindowInsets.getPrivacyIndicatorBounds() for this, but it seems
+ // to give a much larger value when required (leading to a much larger gap), as well as
+ // obviously changing depending on orientation - but whilst this is only an issue for
+ // landscape orientation, it looks better to keep the position consistent for any
+ // orientation (otherwise the icons jump about when changing orientation, which looks
+ // especially bad for UIPLACEMENT_RIGHT).
+ // Not needed for UIPLACEMENT_LEFT - although still adjust the right hand side margin
+ // for consistency.
+ // We do this for all Android versions for consistency (avoids testing overhead due to
+ // different behaviour on different Android versions).
+ if( ui_placement != UIPlacement.UIPLACEMENT_LEFT ) {
+ // if we did want to do this for UIPLACEMENT_LEFT for consistency, it'd be the
+ // "bottom" margin we need to change.
+ gallery_top_gap = (int) (privacy_indicator_gap_dp * scale + 0.5f); // convert dps to pixels
+ }
+ int privacy_indicator_gap = (int) (privacy_indicator_gap_dp * scale + 0.5f); // convert dps to pixels
+ gallery_navigation_gap += privacy_indicator_gap;
+ }
test_navigation_gap = navigation_gap;
if (MyDebug.LOG) {
Log.d(TAG, "navigation_gap: " + navigation_gap);
+ Log.d(TAG, "gallery_navigation_gap: " + gallery_navigation_gap);
}
if (!popup_container_only) {
@@ -408,7 +519,9 @@ public class MainUI {
// is displayed (when taking a photo) if it is still shown left-most, rather than centred; also
// needed for "pause preview" trash/icons to be shown properly (test by rotating the phone to update
// the layout)
- layoutParams.setMargins(0, this_view == first_visible_view ? topMargin : margin / 2, 0, this_view == last_visible_view ? topMargin : margin / 2);
+ int margin_first = this_view==first_visible_view ? topMargin : margin/2;
+ int margin_last = this_view==last_visible_view ? topMargin : margin/2;
+ setMarginsForSystemUI(layoutParams, 0, margin_first, 0, margin_last);
layoutParams.width = button_size;
layoutParams.height = button_size;
this_view.setLayoutParams(layoutParams);
@@ -423,12 +536,20 @@ public class MainUI {
}
} else {
// need to reset size/margins to their default
- for (View this_view : buttons_permanent) {
- layoutParams = (RelativeLayout.LayoutParams) this_view.getLayoutParams();
- layoutParams.setMargins(0, 0, 0, 0);
- layoutParams.width = button_size;
- layoutParams.height = button_size;
- this_view.setLayoutParams(layoutParams);
+ // except for gallery, which still needs its margins set for navigation gap! (and we
+ // shouldn't change its size, which isn't necessarily button_size)
+ view = main_activity.findViewById(R.id.gallery);
+ layoutParams = (RelativeLayout.LayoutParams) view.getLayoutParams();
+ setMarginsForSystemUI(layoutParams, 0, gallery_top_gap, gallery_navigation_gap, 0);
+ view.setLayoutParams(layoutParams);
+ for(View this_view : buttons_permanent) {
+ if( this_view != view ) {
+ layoutParams = (RelativeLayout.LayoutParams)this_view.getLayoutParams();
+ layoutParams.setMargins(0, 0, 0, 0);
+ layoutParams.width = button_size;
+ layoutParams.height = button_size;
+ this_view.setLayoutParams(layoutParams);
+ }
}
}
@@ -481,8 +602,12 @@ public class MainUI {
layoutParams.addRule(align_right, 0);
// layoutParams.addRule(left_of, R.id.zoom_seekbar);
layoutParams.addRule(right_of, 0);
+ layoutParams.addRule(above, 0);
+ layoutParams.addRule(below, 0);
layoutParams.addRule(align_parent_top, 0);
layoutParams.addRule(align_parent_bottom, RelativeLayout.TRUE);
+ layoutParams.addRule(align_parent_left, 0);
+ layoutParams.addRule(align_parent_right, 0);
view.setLayoutParams(layoutParams);
view = main_activity.findViewById(R.id.focus_bracketing_target_seekbar);
@@ -494,12 +619,14 @@ public class MainUI {
layoutParams.addRule(above, R.id.focus_seekbar);
layoutParams.addRule(below, 0);
view.setLayoutParams(layoutParams);
+
+ setFocusSeekbarsRotation();
}
if (!popup_container_only) {
// set seekbar info
int width_dp;
- if (ui_rotation == 0 || ui_rotation == 180) {
+ if( !system_orientation_portrait && (ui_rotation == 0 || ui_rotation == 180) ) {
// landscape
width_dp = 350;
} else {
@@ -513,7 +640,6 @@ public class MainUI {
if (MyDebug.LOG)
Log.d(TAG, "width_dp: " + width_dp);
int height_dp = 50;
- final float scale = main_activity.getResources().getDisplayMetrics().density;
int width_pixels = (int) (width_dp * scale + 0.5f); // convert dps to pixels
int height_pixels = (int) (height_dp * scale + 0.5f); // convert dps to pixels
@@ -522,10 +648,14 @@ public class MainUI {
view.setTranslationX(0.0f);
view.setTranslationY(0.0f);
- if (ui_rotation == 90 || ui_rotation == 270) {
+ if( system_orientation_portrait || ui_rotation == 90 || ui_rotation == 270 ) {
// portrait
- view.setTranslationX(2 * height_pixels);
- } else if (ui_rotation == 0) {
+ if( system_orientation_portrait )
+ view.setTranslationY(2*height_pixels);
+ else
+ view.setTranslationX(2*height_pixels);
+ }
+ else if( ui_rotation == 0 ) {
// landscape
view.setTranslationY(height_pixels);
} else {
@@ -536,9 +666,12 @@ public class MainUI {
/*
// align sliders_container
RelativeLayout.LayoutParams lp = (RelativeLayout.LayoutParams)view.getLayoutParams();
- if( ui_rotation == 90 || ui_rotation == 270 ) {
+ if( system_orientation_portrait || ui_rotation == 90 || ui_rotation == 270 ) {
// portrait
- view.setTranslationX(2*height_pixels);
+ if( system_orientation_portrait )
+ view.setTranslationY(2*height_pixels);
+ else
+ view.setTranslationX(2*height_pixels);
lp.addRule(left_of, 0);
lp.addRule(right_of, 0);
lp.addRule(above, 0);
@@ -615,20 +748,35 @@ public class MainUI {
RelativeLayout.LayoutParams layoutParams = (RelativeLayout.LayoutParams) view.getLayoutParams();
if (ui_placement == UIPlacement.UIPLACEMENT_TOP) {
layoutParams.addRule(align_right, 0);
+ layoutParams.addRule(align_bottom, 0);
+ layoutParams.addRule(align_left, 0);
+ layoutParams.addRule(align_top, 0);
layoutParams.addRule(above, 0);
layoutParams.addRule(below, 0);
layoutParams.addRule(left_of, 0);
layoutParams.addRule(right_of, R.id.top_bg);
- layoutParams.addRule(align_parent_top, RelativeLayout.TRUE);
- layoutParams.addRule(align_parent_bottom, RelativeLayout.TRUE);
- } else {
+ layoutParams.addRule(align_parent_top, system_orientation_portrait ? 0 : RelativeLayout.TRUE);
+ layoutParams.addRule(align_parent_bottom, system_orientation_portrait ? 0 : RelativeLayout.TRUE);
+ layoutParams.addRule(align_parent_left, 0);
+ layoutParams.addRule(align_parent_right, 0);
+ }
+ else {
layoutParams.addRule(align_right, R.id.top_bg);
+ layoutParams.addRule(align_bottom, 0);
+ layoutParams.addRule(align_left, 0);
+ layoutParams.addRule(align_top, 0);
layoutParams.addRule(above, 0);
layoutParams.addRule(below, R.id.top_bg);
layoutParams.addRule(left_of, 0);
layoutParams.addRule(right_of, 0);
layoutParams.addRule(align_parent_top, 0);
- layoutParams.addRule(align_parent_bottom, RelativeLayout.TRUE);
+ layoutParams.addRule(align_parent_bottom, system_orientation_portrait ? 0 : RelativeLayout.TRUE);
+ layoutParams.addRule(align_parent_left, 0);
+ layoutParams.addRule(align_parent_right, 0);
+ }
+ if( system_orientation_portrait ) {
+ // limit height so it doesn't take up the full height of the screen
+ layoutParams.height = display_height;
}
view.setLayoutParams(layoutParams);
@@ -668,6 +816,57 @@ public class MainUI {
}
}
+ /** Wrapper for layoutParams.setMargins, but where the margins are supplied for landscape orientation,
+ * and if in portrait these are automatically rotated.
+ */
+ void setMarginsForSystemUI(RelativeLayout.LayoutParams layoutParams, int left, int top, int right, int bottom) {
+ MainActivity.SystemOrientation system_orientation = main_activity.getSystemOrientation();
+ if( system_orientation == MainActivity.SystemOrientation.PORTRAIT ) {
+ layoutParams.setMargins(bottom, left, top, right);
+ }
+ else if( system_orientation == MainActivity.SystemOrientation.REVERSE_LANDSCAPE ) {
+ layoutParams.setMargins(right, bottom, left, top);
+ }
+ else {
+ layoutParams.setMargins(left, top, right, bottom);
+ }
+ }
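The wrapper above takes margins specified for landscape and permutes them to match the current system orientation. The permutation itself can be sketched outside Android as a pure function (a minimal sketch: `RelativeLayout.LayoutParams` is replaced by a plain `int[4]` of left/top/right/bottom, and the enum here only mirrors the orientations used in the patch):

```java
// Sketch of setMarginsForSystemUI's edge permutation: margins are supplied
// for landscape (left, top, right, bottom) and rotated for the system UI.
public class MarginRotation {
    public enum SystemOrientation { LANDSCAPE, PORTRAIT, REVERSE_LANDSCAPE }

    /** Returns {left, top, right, bottom} as they would be passed to setMargins(). */
    public static int[] rotate(SystemOrientation o, int left, int top, int right, int bottom) {
        switch (o) {
            case PORTRAIT:
                // rotated 90 degrees: landscape bottom becomes portrait left, etc.
                return new int[]{bottom, left, top, right};
            case REVERSE_LANDSCAPE:
                // rotated 180 degrees: left/right and top/bottom swap
                return new int[]{right, bottom, left, top};
            default:
                return new int[]{left, top, right, bottom};
        }
    }
}
```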
+
+ /** Some views (e.g. seekbars and zoom controls) are ones where we want to have a fixed
+ * orientation as if in landscape mode, even if the system UI is portrait. So this method
+ * sets a rotation so that the view appears as if in landscape orientation, and also sets
+ * margins.
+ * Note that Android has poor support for a rotated seekbar - we use view.setRotation(), but
+ * this doesn't affect the bounds of the view! So as a hack, we modify the margins so the
+ * view is positioned correctly. For this to work, the view must have a specified width
+ * (which can be computed programmatically), rather than having both its left and right sides
+ * aligned to another view.
+ * The left/top/right/bottom margins should be supplied for landscape orientation - these will
+ * be automatically rotated if we're actually in portrait orientation.
+ */
+ private void setFixedRotation(View view, int left, int top, int right, int bottom) {
+ MainActivity.SystemOrientation system_orientation = main_activity.getSystemOrientation();
+ int rotation = (360 - MainActivity.getRotationFromSystemOrientation(system_orientation)) % 360;
+ view.setRotation(rotation);
+ // set margins due to rotation
+ RelativeLayout.LayoutParams layoutParams = (RelativeLayout.LayoutParams)view.getLayoutParams();
+ if( system_orientation == MainActivity.SystemOrientation.PORTRAIT ) {
+ int diff = (layoutParams.width-layoutParams.height)/2;
+ if( MyDebug.LOG )
+ Log.d(TAG, "diff: " + diff);
+ setMarginsForSystemUI(layoutParams, +diff+left, -diff+top, +diff+right, -diff+bottom);
+ }
+ else {
+ setMarginsForSystemUI(layoutParams, left, top, right, bottom);
+ }
+ view.setLayoutParams(layoutParams);
+ }
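The `(width - height) / 2` offset compensates for the fact that `view.setRotation()` rotates the drawing about the view's centre without moving its layout bounds: after a 90-degree rotation, the visual bounds of a w x h view are h x w, shifted horizontally by that amount. A standalone check of the geometry (the 840x120 seekbar dimensions are hypothetical numbers for illustration, even widths/heights assumed so integer division is exact):

```java
// After rotating a w x h view 90 degrees about its centre, its visual left
// edge sits (w - h) / 2 inside the layout's left edge; adding +diff / -diff
// to the margins, as setFixedRotation() does, cancels that offset.
public class RotationOffset {
    public static int diff(int width, int height) {
        return (width - height) / 2;
    }

    /** Visual left edge (relative to layout-x 0) after a 90-degree rotation about the centre. */
    public static int visualLeftAfterRotation(int width, int height) {
        int cx = width / 2;     // the centre x is unchanged by the rotation
        return cx - height / 2; // the new half-extent along x is height / 2
    }
}
```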
+
+ void setFocusSeekbarsRotation() {
+ setFixedRotation(main_activity.findViewById(R.id.focus_seekbar), 0, 0, 0, 0);
+ setFixedRotation(main_activity.findViewById(R.id.focus_bracketing_target_seekbar), 0, 0, 0, 0);
+ }
+
private void setPopupViewRotation(int ui_rotation, int display_height) {
if (MyDebug.LOG)
Log.d(TAG, "setPopupViewRotation");
@@ -871,13 +1070,19 @@ public class MainUI {
}
- public void onOrientationChanged(int orientation) {
+ // ParameterCanBeLocal warning suppressed as it's incorrect here! (Or
+ // possibly it's due to the effect of MainActivity.lock_to_landscape always
+ // being false.)
+ public void onOrientationChanged(@SuppressWarnings("ParameterCanBeLocal") int orientation) {
/*if( MyDebug.LOG ) {
Log.d(TAG, "onOrientationChanged()");
Log.d(TAG, "orientation: " + orientation);
Log.d(TAG, "current_orientation: " + current_orientation);
}*/
- if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN)
+ if( !MainActivity.lock_to_landscape )
+ return;
+ // if locked to landscape, we need to handle the orientation change ourselves
+ if( orientation == OrientationEventListener.ORIENTATION_UNKNOWN )
return;
int diff = Math.abs(orientation - current_orientation);
if (diff > 180)
@@ -924,6 +1129,10 @@ public class MainUI {
public boolean showExposureLockIcon() {
if (!main_activity.getPreview().supportsExposureLock())
return false;
+ if( main_activity.getApplicationInterface().isCameraExtensionPref() ) {
+ // not supported for camera extensions
+ return false;
+ }
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(main_activity);
return sharedPreferences.getBoolean(PreferenceKeys.ShowExposureLockPreferenceKey, true);
}
@@ -931,6 +1140,10 @@ public class MainUI {
public boolean showWhiteBalanceLockIcon() {
if (!main_activity.getPreview().supportsWhiteBalanceLock())
return false;
+ if( main_activity.getApplicationInterface().isCameraExtensionPref() ) {
+ // not supported for camera extensions
+ return false;
+ }
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(main_activity);
return sharedPreferences.getBoolean(PreferenceKeys.ShowWhiteBalanceLockPreferenceKey, false);
}
@@ -1014,8 +1227,10 @@ public class MainUI {
View settingsButton = main_activity.findViewById(R.id.settings);
View zoomControls = main_activity.findViewById(R.id.zoom);
View zoomSeekBar = main_activity.findViewById(R.id.zoom_seekbar);
+ View focusSeekBar = main_activity.findViewById(R.id.focus_seekbar);
+ View focusBracketingTargetSeekBar = main_activity.findViewById(R.id.focus_bracketing_target_seekbar);
View zoomSeekbarIcon = main_activity.findViewById(R.id.zoom_seekbar_icon);
- if (main_activity.getPreview().getCameraControllerManager().getNumberOfCameras() > 1)
+ if( main_activity.getPreview().getCameraControllerManager().getNumberOfCameras() > 1 )
switchCameraButton.setVisibility(visibility);
if (main_activity.showSwitchMultiCamIcon())
switchMultiCameraButton.setVisibility(visibility);
@@ -1055,6 +1270,29 @@ public class MainUI {
zoomSeekBar.setVisibility(visibility);
zoomSeekbarIcon.setVisibility(visibility);
}
+ if( main_activity.showManualFocusSeekbar(false) )
+ focusSeekBar.setVisibility(visibility);
+ if( main_activity.showManualFocusSeekbar(true) )
+ focusBracketingTargetSeekBar.setVisibility(visibility);
+ String pref_immersive_mode = sharedPreferences.getString(PreferenceKeys.ImmersiveModePreferenceKey, "immersive_mode_low_profile");
+ if( pref_immersive_mode.equals("immersive_mode_everything") ) {
+ if( sharedPreferences.getBoolean(PreferenceKeys.ShowTakePhotoPreferenceKey, true) ) {
+ View takePhotoButton = main_activity.findViewById(R.id.take_photo);
+ takePhotoButton.setVisibility(visibility);
+ }
+ if( Build.VERSION.SDK_INT >= Build.VERSION_CODES.N && main_activity.getPreview().isVideoRecording() ) {
+ View pauseVideoButton = main_activity.findViewById(R.id.pause_video);
+ pauseVideoButton.setVisibility(visibility);
+ }
+ if( main_activity.getPreview().supportsPhotoVideoRecording() && main_activity.getApplicationInterface().usePhotoVideoRecording() && main_activity.getPreview().isVideoRecording() ) {
+ View takePhotoVideoButton = main_activity.findViewById(R.id.take_photo_when_video_recording);
+ takePhotoVideoButton.setVisibility(visibility);
+ }
+ if( main_activity.getApplicationInterface().getGyroSensor().isRecording() ) {
+ View cancelPanoramaButton = main_activity.findViewById(R.id.cancel_panorama);
+ cancelPanoramaButton.setVisibility(visibility);
+ }
+ }
if (!immersive_mode) {
// make sure the GUI is set up as expected
showGUI();
@@ -1309,7 +1547,8 @@ public class MainUI {
mSelectingExposureUIElement = false;
if (isExposureUIOpen()) {
closeExposureUI();
- } else if (main_activity.getPreview().getCameraController() != null) {
+ }
+ else if( main_activity.getPreview().getCameraController() != null && main_activity.supportsExposureButton() ) {
setupExposureUI();
if (main_activity.getBluetoothRemoteControl().remoteEnabled()) {
initRemoteControlForExposureUI();
@@ -1496,23 +1735,23 @@ public class MainUI {
int count = iso_buttons.size();
int step = previous ? -1 : 1;
boolean found = false;
- for (int i = 0; i < count; i++) {
- Button button = (Button) iso_buttons.get(i);
+ for(int i = 0; i < count; i++) {
+ Button button = (Button)iso_buttons.get(i);
String button_text = "" + button.getText();
- if (button_text.contains(current_iso)) {
+ if( ISOTextEquals(button_text, current_iso) ) {
found = true;
// Select next one, unless it's "Manual", which we skip since
// it's not practical in remote mode.
Button nextButton = (Button) iso_buttons.get((i + count + step) % count);
String nextButton_text = "" + nextButton.getText();
- if (nextButton_text.contains("m")) {
- nextButton = (Button) iso_buttons.get((i + count + 2 * step) % count);
+ if( nextButton_text.contains("m") ) {
+ nextButton = (Button) iso_buttons.get((i+count+2*step)%count);
}
nextButton.callOnClick();
break;
}
}
- if (!found) {
+ if( !found ) {
// For instance, we are in ISO manual mode and "M" is selected. default
// back to "Auto" to avoid being stuck since we're with a remote control
iso_buttons.get(0).callOnClick();
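The remote-control cycling above relies on `(i + count + step) % count` to wrap in both directions: adding `count` first keeps Java's `%` (whose result takes the sign of the dividend) from producing a negative index when `step` is -1, and `2*step` is the double hop used to skip the manual button. A minimal sketch of just that indexing:

```java
// Wrap-around index stepping as used when cycling ISO buttons by remote.
// Valid as long as |step| < count, so i + count + step stays non-negative.
public class WrapIndex {
    public static int next(int i, int count, int step) {
        return (i + count + step) % count; // + count avoids a negative dividend
    }
}
```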
@@ -1530,11 +1769,11 @@ public class MainUI {
private void selectExposureUILine() {
if (MyDebug.LOG)
Log.d(TAG, "selectExposureUILine");
- if (!isExposureUIOpen()) { // Safety check
+ if( !isExposureUIOpen() ) { // Safety check
return;
}
- if (mExposureLine == 0) { // ISO presets
+ if( mExposureLine == 0 ) { // ISO presets
ViewGroup iso_buttons_container = main_activity.findViewById(R.id.iso_buttons);
iso_buttons_container.setBackgroundColor(highlightColorExposureUIElement);
//iso_buttons_container.setAlpha(1f);
@@ -1546,45 +1785,50 @@ public class MainUI {
for (View view : iso_buttons) {
Button button = (Button) view;
String button_text = "" + button.getText();
- if (button_text.contains(current_iso)) {
+ if( ISOTextEquals(button_text, current_iso) ) {
PopupView.setButtonSelected(button, true);
//button.setBackgroundColor(highlightColorExposureUIElement);
//button.setAlpha(0.3f);
found = true;
- } else {
- if (button_text.contains("m")) {
+ }
+ else {
+ if( button_text.contains("m") ) {
manualButton = button;
}
PopupView.setButtonSelected(button, false);
button.setBackgroundColor(Color.TRANSPARENT);
}
}
- if (!found && manualButton != null) {
+ if( !found && manualButton != null ) {
// We are in manual ISO, highlight the "M" button
PopupView.setButtonSelected(manualButton, true);
manualButton.setBackgroundColor(highlightColorExposureUIElement);
//manualButton.setAlpha(0.3f);
}
mSelectingExposureUIElement = true;
- } else if (mExposureLine == 1) {
+ }
+ else if( mExposureLine == 1 ) {
// ISO seek bar - change color
View seek_bar = main_activity.findViewById(R.id.iso_seekbar);
//seek_bar.setAlpha(0.1f);
seek_bar.setBackgroundColor(highlightColorExposureUIElement);
mSelectingExposureUIElement = true;
- } else if (mExposureLine == 2) {
+ }
+ else if( mExposureLine == 2 ) {
// ISO seek bar - change color
View seek_bar = main_activity.findViewById(R.id.exposure_time_seekbar);
//seek_bar.setAlpha(0.1f);
seek_bar.setBackgroundColor(highlightColorExposureUIElement);
mSelectingExposureUIElement = true;
- } else if (mExposureLine == 3) {
+ }
+ else if( mExposureLine == 3 ) {
// Exposure compensation
View container = main_activity.findViewById(R.id.exposure_container);
//container.setAlpha(0.1f);
container.setBackgroundColor(highlightColorExposureUIElement);
mSelectingExposureUIElement = true;
- } else if (mExposureLine == 4) {
+ }
+ else if( mExposureLine == 4 ) {
// Manual white balance
View container = main_activity.findViewById(R.id.white_balance_seekbar);
//container.setAlpha(0.1f);
@@ -1713,14 +1957,14 @@ public class MainUI {
values.add(CameraController.ISO_DEFAULT);
values.add(manual_iso_value);
iso_button_manual_index = 1; // must match where we place the manual button!
- int[] iso_values = {50, 100, 200, 400, 800, 1600, 3200, 6400};
- values.add("" + min_iso);
- for (int iso_value : iso_values) {
- if (iso_value > min_iso && iso_value < max_iso) {
- values.add("" + iso_value);
+ int [] iso_values = {50, 100, 200, 400, 800, 1600, 3200, 6400};
+ values.add(ISOToButtonText(min_iso));
+ for(int iso_value : iso_values) {
+ if( iso_value > min_iso && iso_value < max_iso ) {
+ values.add(ISOToButtonText(iso_value));
}
}
- values.add("" + max_iso);
+ values.add(ISOToButtonText(max_iso));
supported_isos = values;
} else {
supported_isos = preview.getSupportedISOs();
@@ -1760,9 +2004,10 @@ public class MainUI {
editor.putLong(PreferenceKeys.ExposureTimePreferenceKey, CameraController.EXPOSURE_TIME_DEFAULT);
editor.apply();
preview.showToast("ISO: " + toast_option, 0, true); // supply offset_y_dp to be consistent with preview.setExposure(), preview.setISO()
- main_activity.updateForSettings(""); // already showed the toast, so block from showing again
- } else if (old_iso.equals(CameraController.ISO_DEFAULT)) {
- if (MyDebug.LOG)
+ main_activity.updateForSettings(true, ""); // already showed the toast, so block from showing again
+ }
+ else if( old_iso.equals(CameraController.ISO_DEFAULT) ) {
+ if( MyDebug.LOG )
Log.d(TAG, "switched from auto to manual iso");
if (option.equals("m")) {
// if we used the generic "manual", then instead try to preserve the current iso if it exists
@@ -1795,9 +2040,10 @@ public class MainUI {
editor.apply();
preview.showToast("ISO: " + toast_option, 0, true); // supply offset_y_dp to be consistent with preview.setExposure(), preview.setISO()
- main_activity.updateForSettings(""); // already showed the toast, so block from showing again
- } else {
- if (MyDebug.LOG)
+ main_activity.updateForSettings(true, ""); // already showed the toast, so block from showing again
+ }
+ else {
+ if( MyDebug.LOG )
Log.d(TAG, "changed manual iso");
if (option.equals("m")) {
// if user selected the generic "manual", then just keep the previous non-ISO option
@@ -1876,9 +2122,30 @@ public class MainUI {
//layoutUI(); // needed to update alignment of exposure UI
}
- /**
- * If the exposure panel is open, updates the selected ISO button to match the current ISO value,
- * if a continuous range of ISO values are supported by the camera.
+ /** Returns whether the ISO button with the supplied text is a match for the supplied iso.
+ * Should only be used for Preview.supportsISORange()==true (i.e., full manual ISO).
+ */
+ public static boolean ISOTextEquals(String button_text, String iso) {
+ // Can't use equals(), due to the \n that PopupView.getButtonOptionString() inserts, and
+ // also good to make this general in case in future we support other text formats.
+ // We really want to check that iso is the last word in button_text.
+ if( button_text.endsWith(iso) ) {
+ return button_text.length()==iso.length() || Character.isWhitespace( button_text.charAt(button_text.length()-iso.length()-1) );
+ }
+ return false;
+ }
+
+ /** Returns the ISO button text for the supplied iso.
+ * Should only be used for Preview.supportsISORange()==true (i.e., full manual ISO).
+ */
+ public static String ISOToButtonText(int iso) {
+ // n.b., if we change how the ISO is converted to a string for the button, will also need
+ // to update updateSelectedISOButton()
+ return "" + iso;
+ }
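The new `ISOTextEquals()` accepts a match only when the ISO is the final whitespace-delimited token of the button text, so an ISO of "200" does not match a button labelled "3200", while a two-line label like "ISO\n100" still matches "100". The same logic, standalone (the label shapes in the tests are illustrative, not taken from the string resources):

```java
// Last-token matching as in MainUI.ISOTextEquals(): the iso must be the whole
// text, or be preceded by whitespace (e.g. the '\n' the popup buttons insert).
public class IsoMatch {
    public static boolean isoTextEquals(String buttonText, String iso) {
        if (buttonText.endsWith(iso)) {
            return buttonText.length() == iso.length()
                    || Character.isWhitespace(buttonText.charAt(buttonText.length() - iso.length() - 1));
        }
        return false;
    }
}
```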
+
+ /** If the exposure panel is open, updates the selected ISO button to match the current ISO value,
+ * if a continuous range of ISO values is supported by the camera.
*/
public void updateSelectedISOButton() {
if (MyDebug.LOG)
@@ -1896,7 +2163,7 @@ public class MainUI {
if (MyDebug.LOG)
Log.d(TAG, "button: " + button.getText());
String button_text = "" + button.getText();
- if (button_text.contains(current_iso)) {
+ if( ISOTextEquals(button_text, current_iso) ) {
PopupView.setButtonSelected(button, true);
found = true;
} else {
@@ -2259,25 +2526,52 @@ public class MainUI {
}
UIPlacement ui_placement = computeUIPlacement();
+ MainActivity.SystemOrientation system_orientation = main_activity.getSystemOrientation();
float pivot_x;
float pivot_y;
switch (ui_placement) {
case UIPLACEMENT_TOP:
- if (main_activity.getPreview().getUIRotation() == 270) {
+ if( main_activity.getPreview().getUIRotation() == 270 ) {
+ // portrait (when not locked)
pivot_x = 0.0f;
pivot_y = 1.0f;
- } else {
+ }
+ else if( system_orientation == MainActivity.SystemOrientation.REVERSE_LANDSCAPE ) {
+ pivot_x = 1.0f;
+ pivot_y = 1.0f;
+ }
+ else {
pivot_x = 0.0f;
pivot_y = 0.0f;
}
break;
case UIPLACEMENT_LEFT:
- pivot_x = 1.0f;
- pivot_y = 1.0f;
+ if( system_orientation == MainActivity.SystemOrientation.PORTRAIT ) {
+ pivot_x = 0.0f;
+ pivot_y = 1.0f;
+ }
+ else if( system_orientation == MainActivity.SystemOrientation.REVERSE_LANDSCAPE ) {
+ pivot_x = 0.0f;
+ pivot_y = 0.0f;
+ }
+ else {
+ pivot_x = 1.0f;
+ pivot_y = 1.0f;
+ }
break;
default:
- pivot_x = 1.0f;
- pivot_y = 0.0f;
+ if( system_orientation == MainActivity.SystemOrientation.PORTRAIT ) {
+ pivot_x = 1.0f;
+ pivot_y = 1.0f;
+ }
+ else if( system_orientation == MainActivity.SystemOrientation.REVERSE_LANDSCAPE ) {
+ pivot_x = 0.0f;
+ pivot_y = 1.0f;
+ }
+ else {
+ pivot_x = 1.0f;
+ pivot_y = 0.0f;
+ }
break;
}
ScaleAnimation animation = new ScaleAnimation(0.0f, 1.0f, 0.0f, 1.0f, Animation.RELATIVE_TO_SELF, pivot_x, Animation.RELATIVE_TO_SELF, pivot_y);
@@ -2403,17 +2697,18 @@ public class MainUI {
if (main_activity.getPreview().getCameraController() != null) {
String value = sharedPreferences.getString(PreferenceKeys.ISOPreferenceKey, CameraController.ISO_DEFAULT);
boolean manual_iso = !value.equals(CameraController.ISO_DEFAULT);
- if (keyCode == KeyEvent.KEYCODE_VOLUME_UP) {
- if (manual_iso) {
- if (main_activity.getPreview().supportsISORange())
- main_activity.changeISO(1);
- } else
+ if(keyCode == KeyEvent.KEYCODE_VOLUME_UP) {
+ if(manual_iso) {
+ main_activity.changeISO(1);
+ }
+ else
main_activity.changeExposure(1);
- } else {
- if (manual_iso) {
- if (main_activity.getPreview().supportsISORange())
- main_activity.changeISO(-1);
- } else
+ }
+ else {
+ if(manual_iso) {
+ main_activity.changeISO(-1);
+ }
+ else
main_activity.changeExposure(-1);
}
}
diff --git a/app/src/main/java/net/sourceforge/opencamera/ui/ManualSeekbars.java b/app/src/main/java/net/sourceforge/opencamera/ui/ManualSeekbars.java
index 148a296d361bd67413a128bc8ef121f3da0609ff..9a48bcab3ea0ea5b43cf1bcd75b81b145e1d8a72 100644
--- a/app/src/main/java/net/sourceforge/opencamera/ui/ManualSeekbars.java
+++ b/app/src/main/java/net/sourceforge/opencamera/ui/ManualSeekbars.java
@@ -177,21 +177,21 @@ public class ManualSeekbars {
// 1/10,000 to 1/1,000
for(int i=10;i>=1;i--) {
- long exposure = 1000000000L/(i*1000);
+ long exposure = 1000000000L/(i*1000L);
if( exposure > min_exposure_time && exposure < max_exposure_time )
seekbar_values.add(exposure);
}
// 1/900 to 1/100
for(int i=9;i>=1;i--) {
- long exposure = 1000000000L/(i*100);
+ long exposure = 1000000000L/(i*100L);
if( exposure > min_exposure_time && exposure < max_exposure_time )
seekbar_values.add(exposure);
}
// 1/90 to 1/60 (steps of 10)
for(int i=9;i>=6;i--) {
- long exposure = 1000000000L/(i*10);
+ long exposure = 1000000000L/(i*10L);
if( exposure > min_exposure_time && exposure < max_exposure_time )
seekbar_values.add(exposure);
}
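The three loops above build a coarse-to-fine ladder of shutter speeds in nanoseconds, 1/10000s up to 1/1000s, then 1/900s to 1/100s, then 1/90s to 1/60s, each clamped to the camera's supported exposure-time range. The value generation can be reproduced and sanity-checked standalone (the min/max bounds in the test are hypothetical, chosen to admit every value):

```java
import java.util.ArrayList;
import java.util.List;

// Rebuilds the shutter-speed ladder from ManualSeekbars: three ranges of
// 1/x-second exposures in nanoseconds, clamped to (min, max) exclusive.
public class ShutterLadder {
    public static List<Long> build(long minExposureNs, long maxExposureNs) {
        List<Long> values = new ArrayList<>();
        for (int i = 10; i >= 1; i--)   // 1/10,000 to 1/1,000
            add(values, 1000000000L / (i * 1000L), minExposureNs, maxExposureNs);
        for (int i = 9; i >= 1; i--)    // 1/900 to 1/100
            add(values, 1000000000L / (i * 100L), minExposureNs, maxExposureNs);
        for (int i = 9; i >= 6; i--)    // 1/90 to 1/60
            add(values, 1000000000L / (i * 10L), minExposureNs, maxExposureNs);
        return values;
    }

    private static void add(List<Long> values, long exposure, long min, long max) {
        if (exposure > min && exposure < max)
            values.add(exposure);
    }
}
```

Counting down within each range keeps the list in increasing exposure-time order, which is what the seekbar expects.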
diff --git a/app/src/main/java/net/sourceforge/opencamera/ui/MyEditTextPreference.java b/app/src/main/java/net/sourceforge/opencamera/ui/MyEditTextPreference.java
new file mode 100644
index 0000000000000000000000000000000000000000..8a27d3c5e91ddf5d486ae70fb997b8b016b298fd
--- /dev/null
+++ b/app/src/main/java/net/sourceforge/opencamera/ui/MyEditTextPreference.java
@@ -0,0 +1,158 @@
+package net.sourceforge.opencamera.ui;
+
+import android.content.Context;
+import android.content.res.TypedArray;
+import android.os.Parcel;
+import android.os.Parcelable;
+import android.preference.DialogPreference;
+import android.text.TextUtils;
+import android.util.AttributeSet;
+import android.view.View;
+import android.view.inputmethod.EditorInfo;
+import android.widget.EditText;
+import android.widget.TextView;
+
+import foundation.e.camera.R;
+
+
+/** This contains a custom preference for an EditTextPreference. We do all this to fix the problem
+ * that Android's EditTextPreference doesn't satisfy Google's own emoji policy, due to the
+ * programmatically allocated EditText (which means AppCompat can't update it to support emoji
+ * properly). This is fixed with AndroidX (androidx.preference.*), but switching to that is a major
+ * change.
+ * Once we have switched to AndroidX's preference libraries, we can switch back to
+ * EditTextPreference (but check that the emoji strings still work on Android 10 or earlier!)
+ */
+public class MyEditTextPreference extends DialogPreference {
+ //private static final String TAG = "MyEditTextPreference";
+
+ private EditText edittext;
+
+ private String dialogMessage = "";
+ private final int inputType;
+
+ private String value; // current saved value of this preference (note that this is intentionally not updated as the EditText changes, as we don't save until the user clicks ok)
+ private boolean value_set;
+
+ public MyEditTextPreference(Context context, AttributeSet attrs) {
+ super(context, attrs);
+
+ String namespace = "http://schemas.android.com/apk/res/android";
+
+ // can't get both strings and resources to work - only support resources
+ int id = attrs.getAttributeResourceValue(namespace, "dialogMessage", 0);
+ if( id > 0 )
+ this.dialogMessage = context.getString(id);
+
+ this.inputType = attrs.getAttributeIntValue(namespace, "inputType", EditorInfo.TYPE_NULL);
+
+ setDialogLayoutResource(R.layout.myedittextpreference);
+ }
+
+ @Override
+ protected void onBindDialogView(View view) {
+ super.onBindDialogView(view);
+
+ this.edittext = view.findViewById(R.id.myedittextpreference_edittext);
+ this.edittext.setInputType(inputType);
+
+ TextView textView = view.findViewById(R.id.myedittextpreference_summary);
+ textView.setText(dialogMessage);
+
+ if( value != null ) {
+ this.edittext.setText(value);
+ }
+ }
+
+ @Override
+ protected void onDialogClosed(boolean positiveResult) {
+ super.onDialogClosed(positiveResult);
+
+ if( positiveResult ) {
+ String new_value = edittext.getText().toString();
+ if( callChangeListener(new_value) ) {
+ setValue(new_value);
+ }
+ }
+ }
+
+ public String getText() {
+ return value;
+ }
+
+ private void setValue(String value) {
+ final boolean changed = !TextUtils.equals(this.value, value);
+ if( changed || !value_set ) {
+ this.value = value;
+ value_set = true;
+ persistString(value);
+ if( changed ) {
+ notifyChanged();
+ }
+ }
+ }
+
+ @Override
+ protected Object onGetDefaultValue(TypedArray a, int index) {
+ return a.getString(index);
+ }
+
+ @Override
+ protected void onSetInitialValue(boolean restoreValue, Object defaultValue) {
+ setValue(restoreValue ? getPersistedString(value) : (String) defaultValue);
+ }
+
+ @Override
+ protected Parcelable onSaveInstanceState() {
+ final Parcelable superState = super.onSaveInstanceState();
+ if( isPersistent() ) {
+ return superState;
+ }
+
+ final SavedState state = new SavedState(superState);
+ state.value = value;
+ return state;
+ }
+
+ @Override
+ protected void onRestoreInstanceState(Parcelable state) {
+ if( state == null || !state.getClass().equals(SavedState.class) ) {
+ super.onRestoreInstanceState(state);
+ return;
+ }
+
+ SavedState myState = (SavedState)state;
+ super.onRestoreInstanceState(myState.getSuperState());
+ setValue(myState.value);
+ }
+
+ private static class SavedState extends BaseSavedState {
+ String value;
+
+ SavedState(Parcel source) {
+ super(source);
+ value = source.readString();
+ }
+
+ @Override
+ public void writeToParcel(Parcel dest, int flags) {
+ super.writeToParcel(dest, flags);
+ dest.writeString(value);
+ }
+
+ SavedState(Parcelable superState) {
+ super(superState);
+ }
+
+ public static final Parcelable.Creator CREATOR =
+ new Parcelable.Creator() {
+ public SavedState createFromParcel(Parcel in) {
+ return new SavedState(in);
+ }
+
+ public SavedState[] newArray(int size) {
+ return new SavedState[size];
+ }
+ };
+ }
+}
diff --git a/app/src/main/java/net/sourceforge/opencamera/ui/PopupView.java b/app/src/main/java/net/sourceforge/opencamera/ui/PopupView.java
index 875dc0385ae31e6b52e5fbab282aa8fa2811584b..8100aa0f0f2eba41ea0e4673cf6b5c6c56dc6f17 100644
--- a/app/src/main/java/net/sourceforge/opencamera/ui/PopupView.java
+++ b/app/src/main/java/net/sourceforge/opencamera/ui/PopupView.java
@@ -15,17 +15,20 @@ import java.util.Collections;
import java.util.List;
import java.util.Map;
+import android.annotation.SuppressLint;
import android.content.Context;
import android.content.SharedPreferences;
import android.graphics.Bitmap;
import android.graphics.Color;
import android.graphics.Typeface;
+import android.hardware.camera2.CameraExtensionCharacteristics;
import android.os.Build;
import android.os.Handler;
import android.preference.PreferenceManager;
import android.util.Log;
import android.util.TypedValue;
import android.view.Gravity;
+import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.view.ViewTreeObserver.OnGlobalLayoutListener;
@@ -38,10 +41,11 @@ import android.widget.LinearLayout;
import android.widget.RadioButton;
import android.widget.RadioGroup;
import android.widget.ScrollView;
-import android.widget.Switch;
import android.widget.TextView;
import android.widget.ImageView.ScaleType;
+import androidx.appcompat.widget.SwitchCompat;
+
/** This defines the UI for the "popup" button, that provides quick access to a
* range of options.
*/
@@ -114,8 +118,10 @@ public class PopupView extends LinearLayout {
}*/
final Preview preview = main_activity.getPreview();
+ boolean is_camera_extension = main_activity.getApplicationInterface().isCameraExtensionPref();
if( MyDebug.LOG )
Log.d(TAG, "PopupView time 2: " + (System.nanoTime() - debug_time));
+
if( !main_activity.getMainUI().showCycleFlashIcon() )
{
List supported_flash_values = preview.getSupportedFlashValues();
@@ -214,6 +220,26 @@ public class PopupView extends LinearLayout {
photo_modes.add( getResources().getString(use_expanded_menu ? R.string.photo_mode_focus_bracketing_full : R.string.photo_mode_focus_bracketing) );
photo_mode_values.add( MyApplicationInterface.PhotoMode.FocusBracketing );
}
+ if( main_activity.supportsCameraExtension(CameraExtensionCharacteristics.EXTENSION_AUTOMATIC) ) {
+ photo_modes.add( getResources().getString(use_expanded_menu ? R.string.photo_mode_x_auto_full : R.string.photo_mode_x_auto) );
+ photo_mode_values.add( MyApplicationInterface.PhotoMode.X_Auto );
+ }
+ if( main_activity.supportsCameraExtension(CameraExtensionCharacteristics.EXTENSION_HDR) ) {
+ photo_modes.add( getResources().getString(use_expanded_menu ? R.string.photo_mode_x_hdr_full : R.string.photo_mode_x_hdr) );
+ photo_mode_values.add( MyApplicationInterface.PhotoMode.X_HDR );
+ }
+ if( main_activity.supportsCameraExtension(CameraExtensionCharacteristics.EXTENSION_NIGHT) ) {
+ photo_modes.add( getResources().getString(use_expanded_menu ? R.string.photo_mode_x_night_full : R.string.photo_mode_x_night) );
+ photo_mode_values.add( MyApplicationInterface.PhotoMode.X_Night );
+ }
+ if( main_activity.supportsCameraExtension(CameraExtensionCharacteristics.EXTENSION_BOKEH) ) {
+ photo_modes.add( getResources().getString(use_expanded_menu ? R.string.photo_mode_x_bokeh_full : R.string.photo_mode_x_bokeh) );
+ photo_mode_values.add( MyApplicationInterface.PhotoMode.X_Bokeh );
+ }
+ if( main_activity.supportsCameraExtension(CameraExtensionCharacteristics.EXTENSION_BEAUTY) ) {
+ photo_modes.add( getResources().getString(use_expanded_menu ? R.string.photo_mode_x_beauty_full : R.string.photo_mode_x_beauty) );
+ photo_mode_values.add( MyApplicationInterface.PhotoMode.X_Beauty );
+ }
if( preview.isVideo() ) {
// only show photo modes when in photo mode, not video mode!
// (photo modes not supported for photo snapshot whilst recording video)
@@ -383,7 +409,7 @@ public class PopupView extends LinearLayout {
public void run() {
if (MyDebug.LOG)
Log.d(TAG, "update settings due to resolution change");
- main_activity.updateForSettings("", true); // keep the popupview open
+ main_activity.updateForSettings(true, "", true); // keep the popupview open
}
};
@@ -398,8 +424,11 @@ public class PopupView extends LinearLayout {
editor.apply();
// make it easier to scroll through the list of resolutions without a pause each time
+ // need a longer time for extension modes, due to the camera needing to reopen (which will cause the
+ // popup menu to close)
+ final long delay_time = main_activity.getApplicationInterface().isCameraExtensionPref() ? 800 : 400;
handler.removeCallbacks(update_runnable);
- handler.postDelayed(update_runnable, 400);
+ handler.postDelayed(update_runnable, delay_time);
}
@Override
@@ -464,7 +493,7 @@ public class PopupView extends LinearLayout {
public void run() {
if( MyDebug.LOG )
Log.d(TAG, "update settings due to video resolution change");
- main_activity.updateForSettings("", true); // keep the popupview open
+ main_activity.updateForSettings(true, "", true); // keep the popupview open
}
};
@@ -506,7 +535,8 @@ public class PopupView extends LinearLayout {
if( MyDebug.LOG )
Log.d(TAG, "PopupView time 10: " + (System.nanoTime() - debug_time));
- if( preview.getSupportedApertures() != null ) {
+ // apertures probably not supported for camera extensions anyway
+ if( preview.getSupportedApertures() != null && !is_camera_extension ) {
if( MyDebug.LOG )
Log.d(TAG, "add apertures");
@@ -698,9 +728,12 @@ public class PopupView extends LinearLayout {
}
});
- //CheckBox checkBox = new CheckBox(main_activity);
- Switch checkBox = new Switch(main_activity);
+ @SuppressLint("InflateParams")
+ final View switch_view = LayoutInflater.from(context).inflate(R.layout.popupview_switch, null);
+ final SwitchCompat checkBox = switch_view.findViewById(R.id.popupview_switch);
+
checkBox.setText(getResources().getString(R.string.focus_bracketing_add_infinity));
+
{
// align the checkbox a bit better
checkBox.setGravity(Gravity.RIGHT);
@@ -771,8 +804,8 @@ public class PopupView extends LinearLayout {
@Override
public void run() {
if (MyDebug.LOG)
- Log.d(TAG, "update settings due to resolution change");
- main_activity.updateForSettings("", true); // keep the popupview open
+ Log.d(TAG, "update settings due to video capture rate change");
+ main_activity.updateForSettings(true, "", true); // keep the popupview open
}
};
@@ -812,12 +845,12 @@ public class PopupView extends LinearLayout {
old_video_capture_rate_index = video_capture_rate_index;
if( keep_popup ) {
- // make it easier to scroll through the list of resolutions without a pause each time
+ // make it easier to scroll through the list of capture rates without a pause each time
handler.removeCallbacks(update_runnable);
handler.postDelayed(update_runnable, 400);
}
else {
- main_activity.updateForSettings(toast_message, keep_popup);
+ main_activity.updateForSettings(true, toast_message, keep_popup);
}
}
@Override
@@ -981,8 +1014,10 @@ public class PopupView extends LinearLayout {
if( MyDebug.LOG )
Log.d(TAG, "PopupView time 13: " + (System.nanoTime() - debug_time));
+ // white balance modes, scene modes, color effects
+ // all of these are only supported when not using extension mode
// popup should only be opened if we have a camera controller, but check just to be safe
- if( preview.getCameraController() != null ) {
+ if( preview.getCameraController() != null && !is_camera_extension ) {
List supported_white_balances = preview.getSupportedWhiteBalances();
List supported_white_balances_entries = null;
if( supported_white_balances != null ) {
@@ -1016,7 +1051,7 @@ public class PopupView extends LinearLayout {
if( preview.getCameraController() != null ) {
if( preview.getCameraController().sceneModeAffectsFunctionality() ) {
// need to call updateForSettings() and close the popup, as changing scene mode can change available camera features
- main_activity.updateForSettings(getResources().getString(R.string.scene_mode) + ": " + main_activity.getMainUI().getEntryForSceneMode(selected_value));
+ main_activity.updateForSettings(true, getResources().getString(R.string.scene_mode) + ": " + main_activity.getMainUI().getEntryForSceneMode(selected_value));
main_activity.closePopup();
}
else {
@@ -1054,6 +1089,9 @@ public class PopupView extends LinearLayout {
}
setBackgroundColor(getResources().getColor(R.color.color_popup_bg));
+
+ if( MyDebug.LOG )
+ Log.d(TAG, "Overall PopupView time: " + (System.nanoTime() - debug_time));
}
int getTotalWidth() {
@@ -1099,6 +1137,21 @@ public class PopupView extends LinearLayout {
case Panorama:
toast_message = getResources().getString(R.string.photo_mode_panorama_full);
break;
+ case X_Auto:
+ toast_message = getResources().getString(R.string.photo_mode_x_auto_full);
+ break;
+ case X_HDR:
+ toast_message = getResources().getString(R.string.photo_mode_x_hdr_full);
+ break;
+ case X_Night:
+ toast_message = getResources().getString(R.string.photo_mode_x_night_full);
+ break;
+ case X_Bokeh:
+ toast_message = getResources().getString(R.string.photo_mode_x_bokeh_full);
+ break;
+ case X_Beauty:
+ toast_message = getResources().getString(R.string.photo_mode_x_beauty_full);
+ break;
}
final SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(main_activity);
SharedPreferences.Editor editor = sharedPreferences.edit();
@@ -1127,6 +1180,21 @@ public class PopupView extends LinearLayout {
case Panorama:
editor.putString(PreferenceKeys.PhotoModePreferenceKey, "preference_photo_mode_panorama");
break;
+ case X_Auto:
+ editor.putString(PreferenceKeys.PhotoModePreferenceKey, "preference_photo_mode_x_auto");
+ break;
+ case X_HDR:
+ editor.putString(PreferenceKeys.PhotoModePreferenceKey, "preference_photo_mode_x_hdr");
+ break;
+ case X_Night:
+ editor.putString(PreferenceKeys.PhotoModePreferenceKey, "preference_photo_mode_x_night");
+ break;
+ case X_Bokeh:
+ editor.putString(PreferenceKeys.PhotoModePreferenceKey, "preference_photo_mode_x_bokeh");
+ break;
+ case X_Beauty:
+ editor.putString(PreferenceKeys.PhotoModePreferenceKey, "preference_photo_mode_x_beauty");
+ break;
default:
if (MyDebug.LOG)
Log.e(TAG, "unknown new_photo_mode: " + new_photo_mode);
@@ -1156,7 +1224,7 @@ public class PopupView extends LinearLayout {
}
main_activity.getApplicationInterface().getDrawPreview().updateSettings(); // because we cache the photomode
- main_activity.updateForSettings(toast_message); // need to setup the camera again, as options may change (e.g., required burst mode, or whether RAW is allowed in this mode)
+ main_activity.updateForSettings(true, toast_message); // need to setup the camera again, as options may change (e.g., required burst mode, or whether RAW is allowed in this mode)
main_activity.getMainUI().destroyPopup(); // need to recreate popup for new selection
}
}
@@ -1228,6 +1296,10 @@ public class PopupView extends LinearLayout {
createButtonOptions(this, this.getContext(), total_width_dp, main_activity.getMainUI().getTestUIButtonsMap(), supported_options, icons_id, values_id, prefix_string, true, current_value, max_buttons_per_row, test_key, listener);
}
+ public static String getButtonOptionString(boolean include_prefix, String prefix_string, String supported_option) {
+ return (include_prefix ? prefix_string : "") + "\n" + supported_option;
+ }
+
static List createButtonOptions(ViewGroup parent, Context context, int total_width_dp, Map test_ui_buttons, List supported_options, int icons_id, int values_id, String prefix_string, boolean include_prefix, String current_value, int max_buttons_per_row, String test_key, final ButtonOptionsPopupListener listener) {
if( MyDebug.LOG )
Log.d(TAG, "createButtonOptions");
@@ -1338,13 +1410,13 @@ public class PopupView extends LinearLayout {
button_string = supported_option;
}
else if( prefix_string.equalsIgnoreCase("ISO") && supported_option.length() >= 4 && supported_option.substring(0, 4).equalsIgnoreCase("ISO_") ) {
- button_string = (include_prefix ? prefix_string : "") + "\n" + supported_option.substring(4);
+ button_string = getButtonOptionString(include_prefix, prefix_string, supported_option.substring(4));
}
else if( prefix_string.equalsIgnoreCase("ISO") && supported_option.length() >= 3 && supported_option.substring(0, 3).equalsIgnoreCase("ISO") ) {
- button_string = (include_prefix ? prefix_string : "") + "\n" + supported_option.substring(3);
+ button_string = getButtonOptionString(include_prefix, prefix_string, supported_option.substring(3));
}
else {
- button_string = (include_prefix ? prefix_string : "") + "\n" + supported_option;
+ button_string = getButtonOptionString(include_prefix, prefix_string, supported_option);
}
if( MyDebug.LOG )
Log.d(TAG, "button_string: " + button_string);
@@ -1378,7 +1450,10 @@ public class PopupView extends LinearLayout {
view.setPadding(imageButtonPadding, imageButtonPadding, imageButtonPadding, imageButtonPadding);
}
else {
- Button button = new Button(context);
+ @SuppressLint("InflateParams")
+ final View button_view = LayoutInflater.from(context).inflate(R.layout.popupview_button, null);
+ final Button button = button_view.findViewById(R.id.button);
+
button.setBackgroundColor(Color.TRANSPARENT); // workaround for Android 6 crash! Also looks nicer anyway...
view = button;
buttons.add(view);
@@ -1480,16 +1555,21 @@ public class PopupView extends LinearLayout {
}
private void addTitleToPopup(final String title) {
- TextView text_view = new TextView(this.getContext());
+ final long debug_time = System.nanoTime();
+
+ @SuppressLint("InflateParams")
+ final View view = LayoutInflater.from(this.getContext()).inflate(R.layout.popupview_textview, null);
+ final TextView text_view = view.findViewById(R.id.text_view);
+
text_view.setText(title + ":");
- text_view.setTextColor(Color.WHITE);
- text_view.setGravity(Gravity.CENTER);
text_view.setTextSize(TypedValue.COMPLEX_UNIT_DIP, title_text_size_dip);
text_view.setTypeface(null, Typeface.BOLD);
text_view.setPadding(0, 5, 0, 5);
//text_view.setBackgroundColor(Color.GRAY); // debug
text_view.setBackgroundColor(getResources().getColor(R.color.color_popup_title_bg));
this.addView(text_view);
+ if( MyDebug.LOG )
+ Log.d(TAG, "addTitleToPopup time: " + (System.nanoTime() - debug_time));
}
private abstract static class RadioOptionsListener {
@@ -1659,7 +1739,16 @@ public class PopupView extends LinearLayout {
}
if( MyDebug.LOG )
Log.d(TAG, "addRadioOptionsToGroup time 1: " + (System.nanoTime() - debug_time));
- RadioButton button = new RadioButton(this.getContext());
+
+ // Inflating from XML made opening the radio button sub-menus much slower on old devices (e.g., Galaxy Nexus);
+ // however, testing showed it is just as slow if we programmatically create a new AppCompatRadioButton().
+ // I.e., the slowdown is due to using AppCompatRadioButton (which AppCompat will automatically use if creating
+ // a RadioButton from XML) rather than inflating from XML.
+ // Whilst creating a new RadioButton() was faster, we can't do that anymore due to emoji policy!
+ @SuppressLint("InflateParams")
+ final View view = LayoutInflater.from(this.getContext()).inflate(R.layout.popupview_radiobutton, null);
+ final RadioButton button = view.findViewById(R.id.popupview_radiobutton);
+
if( MyDebug.LOG )
Log.d(TAG, "addRadioOptionsToGroup time 2: " + (System.nanoTime() - debug_time));
@@ -1703,7 +1792,7 @@ public class PopupView extends LinearLayout {
listener.onClick(supported_option_value);
}
else {
- main_activity.updateForSettings(title + ": " + supported_option_entry);
+ main_activity.updateForSettings(true, title + ": " + supported_option_entry);
main_activity.closePopup();
}
}
@@ -1749,25 +1838,18 @@ public class PopupView extends LinearLayout {
}
final MainActivity main_activity = (MainActivity)this.getContext();
- /*final Button prev_button = new Button(this.getContext());
- //prev_button.setBackgroundResource(R.drawable.exposure);
- prev_button.setBackgroundColor(Color.TRANSPARENT); // workaround for Android 6 crash!
- prev_button.setText("<");
- this.addView(prev_button);*/
- LinearLayout ll2 = new LinearLayout(this.getContext());
- ll2.setOrientation(LinearLayout.HORIZONTAL);
- if( !title_in_options ) {
- ll2.setPadding(0,40,0, 0);
- }
+ final long debug_time = System.nanoTime();
+ @SuppressLint("InflateParams")
+ final View ll2 = LayoutInflater.from(this.getContext()).inflate(R.layout.popupview_arrayoptions, null);
+ final TextView text_view = ll2.findViewById(R.id.text_view);
+ final ImageButton prev_button = ll2.findViewById(R.id.button_left);
+ final ImageButton next_button = ll2.findViewById(R.id.button_right);
- final TextView text_view = new TextView(this.getContext());
setArrayOptionsText(supported_options, title, text_view, title_in_options, title_in_options_first_only, current_index);
//text_view.setBackgroundColor(Color.GRAY); // debug
text_view.setTextSize(TypedValue.COMPLEX_UNIT_DIP, standard_text_size_dip);
- text_view.setTextColor(Color.WHITE);
- text_view.setGravity(Gravity.CENTER);
text_view.setSingleLine(true); // if the text is too long for the button, we'd rather not have word wrap, even if it means cutting some text off
LinearLayout.LayoutParams params = new LinearLayout.LayoutParams(LinearLayout.LayoutParams.WRAP_CONTENT, LinearLayout.LayoutParams.WRAP_CONTENT, 1.0f);
// Yuck! We want the arrow_button_w to be fairly large so that users can touch the arrow buttons easily, but if
@@ -1779,9 +1861,7 @@ public class PopupView extends LinearLayout {
final float scale = getResources().getDisplayMetrics().density;
final int padding = (int) (0 * scale + 0.5f); // convert dps to pixels
- final ImageButton prev_button = new ImageButton(this.getContext());
prev_button.setBackgroundColor(Color.TRANSPARENT); // workaround for Android 6 crash!
- ll2.addView(prev_button);
prev_button.setImageDrawable(this.getResources().getDrawable(R.drawable.ic_arrow_left));
prev_button.setPadding(padding, padding, padding, padding);
ViewGroup.LayoutParams vg_params = prev_button.getLayoutParams();
@@ -1792,12 +1872,10 @@ public class PopupView extends LinearLayout {
prev_button.setContentDescription( getResources().getString(R.string.previous) + " " + title);
main_activity.getMainUI().getTestUIButtonsMap().put(test_key + "_PREV", prev_button);
- ll2.addView(text_view);
+ //ll2.addView(text_view);
main_activity.getMainUI().getTestUIButtonsMap().put(test_key, text_view);
- final ImageButton next_button = new ImageButton(this.getContext());
next_button.setBackgroundColor(Color.TRANSPARENT); // workaround for Android 6 crash!
- ll2.addView(next_button);
next_button.setImageDrawable(this.getResources().getDrawable(R.drawable.ic_arrow_right));
next_button.setPadding(padding, padding, padding, padding);
vg_params = next_button.getLayoutParams();
@@ -1838,6 +1916,9 @@ public class PopupView extends LinearLayout {
});
this.addView(ll2);
+
+ if( MyDebug.LOG )
+ Log.d(TAG, "addArrayOptionsToPopup time: " + (System.nanoTime() - debug_time));
}
}
}
diff --git a/app/src/main/res/drawable-hdpi/baseline_bedtime_white_48.png b/app/src/main/res/drawable-hdpi/baseline_bedtime_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..8f775b967672071173d9d74319596563202c2cf4
Binary files /dev/null and b/app/src/main/res/drawable-hdpi/baseline_bedtime_white_48.png differ
diff --git a/app/src/main/res/drawable-hdpi/baseline_delete_white_48.png b/app/src/main/res/drawable-hdpi/baseline_delete_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..aadf7229ee75795998b33c2dd239d0a2c4cc69e6
Binary files /dev/null and b/app/src/main/res/drawable-hdpi/baseline_delete_white_48.png differ
diff --git a/app/src/main/res/drawable-hdpi/baseline_face_retouching_natural_white_48.png b/app/src/main/res/drawable-hdpi/baseline_face_retouching_natural_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..a5ee6b45a25ab42fa5f38c6bc54eb160b3db6dc6
Binary files /dev/null and b/app/src/main/res/drawable-hdpi/baseline_face_retouching_natural_white_48.png differ
diff --git a/app/src/main/res/drawable-hdpi/baseline_portrait_white_48.png b/app/src/main/res/drawable-hdpi/baseline_portrait_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..8e6aeda802f437a86decd9c306e0acffc732cdb9
Binary files /dev/null and b/app/src/main/res/drawable-hdpi/baseline_portrait_white_48.png differ
diff --git a/app/src/main/res/drawable-hdpi/baseline_switch_camera_white_48.png b/app/src/main/res/drawable-hdpi/baseline_switch_camera_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..27e8659083c7ec3f9ed7ac2f6da0e7c8752b5e6c
Binary files /dev/null and b/app/src/main/res/drawable-hdpi/baseline_switch_camera_white_48.png differ
diff --git a/app/src/main/res/drawable-mdpi/baseline_bedtime_white_48.png b/app/src/main/res/drawable-mdpi/baseline_bedtime_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..de0eac8c8a9306cff28058a6f54f6535461276ed
Binary files /dev/null and b/app/src/main/res/drawable-mdpi/baseline_bedtime_white_48.png differ
diff --git a/app/src/main/res/drawable-mdpi/baseline_delete_white_48.png b/app/src/main/res/drawable-mdpi/baseline_delete_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..ad2fa12f66498700c7434afaa8ce7a2d017b693e
Binary files /dev/null and b/app/src/main/res/drawable-mdpi/baseline_delete_white_48.png differ
diff --git a/app/src/main/res/drawable-mdpi/baseline_face_retouching_natural_white_48.png b/app/src/main/res/drawable-mdpi/baseline_face_retouching_natural_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..a7035beeb2e094b4a1f98c2e89d49f4b931c3ea7
Binary files /dev/null and b/app/src/main/res/drawable-mdpi/baseline_face_retouching_natural_white_48.png differ
diff --git a/app/src/main/res/drawable-mdpi/baseline_portrait_white_48.png b/app/src/main/res/drawable-mdpi/baseline_portrait_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..10e3f08b8a3bb4be84bfba1ddb6080b5e0bbde7c
Binary files /dev/null and b/app/src/main/res/drawable-mdpi/baseline_portrait_white_48.png differ
diff --git a/app/src/main/res/drawable-mdpi/baseline_switch_camera_white_48.png b/app/src/main/res/drawable-mdpi/baseline_switch_camera_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..b981fc24f73d9bb95b22b37e274e363380f6e78c
Binary files /dev/null and b/app/src/main/res/drawable-mdpi/baseline_switch_camera_white_48.png differ
diff --git a/app/src/main/res/drawable-mdpi/expo_icon.png b/app/src/main/res/drawable-mdpi/expo_icon.png
index 5cf89e836713e1213a47e54bd4441ea76657b2e6..1ac5d28f3bbba70c884dbd6f4b44de60a49c18bd 100644
Binary files a/app/src/main/res/drawable-mdpi/expo_icon.png and b/app/src/main/res/drawable-mdpi/expo_icon.png differ
diff --git a/app/src/main/res/drawable-mdpi/focus_bracket_icon.png b/app/src/main/res/drawable-mdpi/focus_bracket_icon.png
deleted file mode 100644
index 8042ba781062f2249d471e3b071fa9c57f83a413..0000000000000000000000000000000000000000
Binary files a/app/src/main/res/drawable-mdpi/focus_bracket_icon.png and /dev/null differ
diff --git a/app/src/main/res/drawable-mdpi/focus_mode_auto.png b/app/src/main/res/drawable-mdpi/focus_mode_auto.png
index 65593db0f97873735bbe551d2c702e7e72bb4f9d..d4a27da69de6ebd36017e13a6759a03874e8ea36 100644
Binary files a/app/src/main/res/drawable-mdpi/focus_mode_auto.png and b/app/src/main/res/drawable-mdpi/focus_mode_auto.png differ
diff --git a/app/src/main/res/drawable-mdpi/focus_mode_continuous_picture.png b/app/src/main/res/drawable-mdpi/focus_mode_continuous_picture.png
index 24bcb7d5e427273a5aaf9f7e5a391cc2143ccd21..2979dafd62c8c915fc0373b220d1e0dc5f9514d1 100644
Binary files a/app/src/main/res/drawable-mdpi/focus_mode_continuous_picture.png and b/app/src/main/res/drawable-mdpi/focus_mode_continuous_picture.png differ
diff --git a/app/src/main/res/drawable-mdpi/focus_mode_continuous_video.png b/app/src/main/res/drawable-mdpi/focus_mode_continuous_video.png
index 24bcb7d5e427273a5aaf9f7e5a391cc2143ccd21..2979dafd62c8c915fc0373b220d1e0dc5f9514d1 100644
Binary files a/app/src/main/res/drawable-mdpi/focus_mode_continuous_video.png and b/app/src/main/res/drawable-mdpi/focus_mode_continuous_video.png differ
diff --git a/app/src/main/res/drawable-mdpi/focus_mode_edof.png b/app/src/main/res/drawable-mdpi/focus_mode_edof.png
index 7a74bab00af967cd60583fa96fbd97e19dcd75ad..6c5474f3205f4aec604e8426cd3f247fcf17fcf6 100644
Binary files a/app/src/main/res/drawable-mdpi/focus_mode_edof.png and b/app/src/main/res/drawable-mdpi/focus_mode_edof.png differ
diff --git a/app/src/main/res/drawable-mdpi/focus_mode_fixed.png b/app/src/main/res/drawable-mdpi/focus_mode_fixed.png
index 8f07487c24eeeb87f173df73197bb4dffe5e0276..5d03fd580f60a4b160b8ba412007a6a0c7bf50d8 100644
Binary files a/app/src/main/res/drawable-mdpi/focus_mode_fixed.png and b/app/src/main/res/drawable-mdpi/focus_mode_fixed.png differ
diff --git a/app/src/main/res/drawable-mdpi/focus_mode_infinity.png b/app/src/main/res/drawable-mdpi/focus_mode_infinity.png
index 7e8c1502004d2bb2086ae62571f9d5c010631937..3068e51e7070c3506218e3df76df3759dda4dfef 100644
Binary files a/app/src/main/res/drawable-mdpi/focus_mode_infinity.png and b/app/src/main/res/drawable-mdpi/focus_mode_infinity.png differ
diff --git a/app/src/main/res/drawable-mdpi/focus_mode_manual.png b/app/src/main/res/drawable-mdpi/focus_mode_manual.png
index b7e5fbbaad9c4e5c87731450df3e6c63fb23d984..4efce75eb1dfe3f9c66aadfdd2874e24356fa82e 100644
Binary files a/app/src/main/res/drawable-mdpi/focus_mode_manual.png and b/app/src/main/res/drawable-mdpi/focus_mode_manual.png differ
diff --git a/app/src/main/res/drawable-mdpi/white_balance_locked.png b/app/src/main/res/drawable-mdpi/white_balance_locked.png
index 70f72b066bffaaf64d05586d7233c49cf39e0e97..d8e976a1fbda4305459f6cbe8ae7606e8f9dba38 100644
Binary files a/app/src/main/res/drawable-mdpi/white_balance_locked.png and b/app/src/main/res/drawable-mdpi/white_balance_locked.png differ
diff --git a/app/src/main/res/drawable-mdpi/white_balance_unlocked.png b/app/src/main/res/drawable-mdpi/white_balance_unlocked.png
index 6191a2339f32dd14cf9abab856013c80c5218996..249c781d47117a5536086ce2a16f0b45e35aaa93 100644
Binary files a/app/src/main/res/drawable-mdpi/white_balance_unlocked.png and b/app/src/main/res/drawable-mdpi/white_balance_unlocked.png differ
diff --git a/app/src/main/res/drawable-xhdpi/baseline_bedtime_white_48.png b/app/src/main/res/drawable-xhdpi/baseline_bedtime_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..b2ab295001eafa8807759d38ba3a97be2244367d
Binary files /dev/null and b/app/src/main/res/drawable-xhdpi/baseline_bedtime_white_48.png differ
diff --git a/app/src/main/res/drawable-xhdpi/baseline_delete_white_48.png b/app/src/main/res/drawable-xhdpi/baseline_delete_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..a73379951d7a8eece2b6698bfb08eeea2665c8ef
Binary files /dev/null and b/app/src/main/res/drawable-xhdpi/baseline_delete_white_48.png differ
diff --git a/app/src/main/res/drawable-xhdpi/baseline_face_retouching_natural_white_48.png b/app/src/main/res/drawable-xhdpi/baseline_face_retouching_natural_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..1e412cc265621ea7207337ee5e97c9adca014378
Binary files /dev/null and b/app/src/main/res/drawable-xhdpi/baseline_face_retouching_natural_white_48.png differ
diff --git a/app/src/main/res/drawable-xhdpi/baseline_portrait_white_48.png b/app/src/main/res/drawable-xhdpi/baseline_portrait_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..b1289cb11170503795f2582be7d5feba69b84694
Binary files /dev/null and b/app/src/main/res/drawable-xhdpi/baseline_portrait_white_48.png differ
diff --git a/app/src/main/res/drawable-xhdpi/baseline_switch_camera_white_48.png b/app/src/main/res/drawable-xhdpi/baseline_switch_camera_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..3754beb48642930ca7b66109847269d29f7b3408
Binary files /dev/null and b/app/src/main/res/drawable-xhdpi/baseline_switch_camera_white_48.png differ
diff --git a/app/src/main/res/drawable-xxhdpi/baseline_bedtime_white_48.png b/app/src/main/res/drawable-xxhdpi/baseline_bedtime_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..dabbb83879dc473c9b8ef74b23542db5cc15879a
Binary files /dev/null and b/app/src/main/res/drawable-xxhdpi/baseline_bedtime_white_48.png differ
diff --git a/app/src/main/res/drawable-xxhdpi/baseline_delete_white_48.png b/app/src/main/res/drawable-xxhdpi/baseline_delete_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..c647fead353a482f98553fbbfe139a1344d77d7e
Binary files /dev/null and b/app/src/main/res/drawable-xxhdpi/baseline_delete_white_48.png differ
diff --git a/app/src/main/res/drawable-xxhdpi/baseline_face_retouching_natural_white_48.png b/app/src/main/res/drawable-xxhdpi/baseline_face_retouching_natural_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..9457ceb411ad4d4a606263679a68d0cf13b89064
Binary files /dev/null and b/app/src/main/res/drawable-xxhdpi/baseline_face_retouching_natural_white_48.png differ
diff --git a/app/src/main/res/drawable-xxhdpi/baseline_portrait_white_48.png b/app/src/main/res/drawable-xxhdpi/baseline_portrait_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..df4b6784cdc4e235b4e36c4796d66cedc5eba763
Binary files /dev/null and b/app/src/main/res/drawable-xxhdpi/baseline_portrait_white_48.png differ
diff --git a/app/src/main/res/drawable-xxhdpi/baseline_switch_camera_white_48.png b/app/src/main/res/drawable-xxhdpi/baseline_switch_camera_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..7c63ba16287c69011c8f214f7bcc85948cb51bba
Binary files /dev/null and b/app/src/main/res/drawable-xxhdpi/baseline_switch_camera_white_48.png differ
diff --git a/app/src/main/res/drawable-xxhdpi/expo_icon.png b/app/src/main/res/drawable-xxhdpi/expo_icon.png
index e84008a50dfc19de1b6ea5892c4fd405d9318c31..e68d4dbf497f88a9869534343930923e9cb7e33c 100644
Binary files a/app/src/main/res/drawable-xxhdpi/expo_icon.png and b/app/src/main/res/drawable-xxhdpi/expo_icon.png differ
diff --git a/app/src/main/res/drawable-xxhdpi/focus_bracket_icon.png b/app/src/main/res/drawable-xxhdpi/focus_bracket_icon.png
deleted file mode 100644
index aab12557a15122c44bf1cffd0459254f014a6e01..0000000000000000000000000000000000000000
Binary files a/app/src/main/res/drawable-xxhdpi/focus_bracket_icon.png and /dev/null differ
diff --git a/app/src/main/res/drawable-xxxhdpi/baseline_bedtime_white_48.png b/app/src/main/res/drawable-xxxhdpi/baseline_bedtime_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..3bda163cf49514723c35ca808db415425cae2046
Binary files /dev/null and b/app/src/main/res/drawable-xxxhdpi/baseline_bedtime_white_48.png differ
diff --git a/app/src/main/res/drawable-xxxhdpi/baseline_delete_white_48.png b/app/src/main/res/drawable-xxxhdpi/baseline_delete_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..b40da7917b111877f34a7377a38ffa69d27a43e9
Binary files /dev/null and b/app/src/main/res/drawable-xxxhdpi/baseline_delete_white_48.png differ
diff --git a/app/src/main/res/drawable-xxxhdpi/baseline_face_retouching_natural_white_48.png b/app/src/main/res/drawable-xxxhdpi/baseline_face_retouching_natural_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..6a8acc337a1217fc493687517cc93238c25a4640
Binary files /dev/null and b/app/src/main/res/drawable-xxxhdpi/baseline_face_retouching_natural_white_48.png differ
diff --git a/app/src/main/res/drawable-xxxhdpi/baseline_portrait_white_48.png b/app/src/main/res/drawable-xxxhdpi/baseline_portrait_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..3c0064bd1e3410f270ee7b9c845c7d53dade4b16
Binary files /dev/null and b/app/src/main/res/drawable-xxxhdpi/baseline_portrait_white_48.png differ
diff --git a/app/src/main/res/drawable-xxxhdpi/baseline_switch_camera_white_48.png b/app/src/main/res/drawable-xxxhdpi/baseline_switch_camera_white_48.png
new file mode 100644
index 0000000000000000000000000000000000000000..b2a60a8ece20af48f1e23547d9848e80b9265952
Binary files /dev/null and b/app/src/main/res/drawable-xxxhdpi/baseline_switch_camera_white_48.png differ
diff --git a/app/src/main/res/drawable-xxxhdpi/flash_auto.png b/app/src/main/res/drawable-xxxhdpi/flash_auto.png
new file mode 100644
index 0000000000000000000000000000000000000000..60e6515190dfee513b261b8da911e88b1611f929
Binary files /dev/null and b/app/src/main/res/drawable-xxxhdpi/flash_auto.png differ
diff --git a/app/src/main/res/drawable-xxxhdpi/flash_off.png b/app/src/main/res/drawable-xxxhdpi/flash_off.png
new file mode 100644
index 0000000000000000000000000000000000000000..2e8637c8f6524eefa94327c30430be2bc7dc0eb3
Binary files /dev/null and b/app/src/main/res/drawable-xxxhdpi/flash_off.png differ
diff --git a/app/src/main/res/drawable-xxxhdpi/flash_on.png b/app/src/main/res/drawable-xxxhdpi/flash_on.png
new file mode 100644
index 0000000000000000000000000000000000000000..8d8f1c9b81df59ca6b48e7b3753e287c0cfe509f
Binary files /dev/null and b/app/src/main/res/drawable-xxxhdpi/flash_on.png differ
diff --git a/app/src/main/res/layout/activity_device_select.xml b/app/src/main/res/layout/activity_device_select.xml
index 930c4b4c6e677c827e760a0879e0414c06964916..b8d79cf7b92a95a94d4b511f67bb714ec7e698c7 100644
--- a/app/src/main/res/layout/activity_device_select.xml
+++ b/app/src/main/res/layout/activity_device_select.xml
@@ -28,8 +28,9 @@
android:layout_marginTop="0dp"
android:text="@string/bluetooth_scan" />
+
+
+
+
+
+
+
diff --git a/app/src/main/res/layout/alertdialog_textview.xml b/app/src/main/res/layout/alertdialog_textview.xml
new file mode 100644
index 0000000000000000000000000000000000000000..87ef8dcd055d614026cb28ee42bbec32833d53be
--- /dev/null
+++ b/app/src/main/res/layout/alertdialog_textview.xml
@@ -0,0 +1,12 @@
+
+
+
diff --git a/app/src/main/res/layout/arrayseekbarpreference.xml b/app/src/main/res/layout/arrayseekbarpreference.xml
index 994fee8d8950dfd3314418d4db8235ce9e075626..d0a6a391739b734177777daf5ea0aa2bdbb960c9 100644
--- a/app/src/main/res/layout/arrayseekbarpreference.xml
+++ b/app/src/main/res/layout/arrayseekbarpreference.xml
@@ -13,6 +13,7 @@
diff --git a/app/src/main/res/layout/myedittextpreference.xml b/app/src/main/res/layout/myedittextpreference.xml
new file mode 100644
index 0000000000000000000000000000000000000000..4ca3fa6c9a43ed9543fa844ec5d44baf9935a03e
--- /dev/null
+++ b/app/src/main/res/layout/myedittextpreference.xml
@@ -0,0 +1,29 @@
+
+
+
+
+
+
diff --git a/app/src/main/res/layout/popupview_arrayoptions.xml b/app/src/main/res/layout/popupview_arrayoptions.xml
new file mode 100644
index 0000000000000000000000000000000000000000..2429e25706bfd698ce331b9dfea64a98cf58ad5d
--- /dev/null
+++ b/app/src/main/res/layout/popupview_arrayoptions.xml
@@ -0,0 +1,30 @@
+
+
+
+
+
+
+
+
diff --git a/app/src/main/res/layout/popupview_button.xml b/app/src/main/res/layout/popupview_button.xml
new file mode 100644
index 0000000000000000000000000000000000000000..3b200f4affdbd7af019de6c303da44a70b93eb66
--- /dev/null
+++ b/app/src/main/res/layout/popupview_button.xml
@@ -0,0 +1,11 @@
+
+
+
diff --git a/app/src/main/res/layout/popupview_radiobutton.xml b/app/src/main/res/layout/popupview_radiobutton.xml
new file mode 100644
index 0000000000000000000000000000000000000000..093c9316102e8cc8d4797406a4c2be679fea7492
--- /dev/null
+++ b/app/src/main/res/layout/popupview_radiobutton.xml
@@ -0,0 +1,11 @@
+
+
+
diff --git a/app/src/main/res/layout/popupview_switch.xml b/app/src/main/res/layout/popupview_switch.xml
new file mode 100644
index 0000000000000000000000000000000000000000..e6b5fa125905f71cb9e34f98db649776da8ebdb0
--- /dev/null
+++ b/app/src/main/res/layout/popupview_switch.xml
@@ -0,0 +1,14 @@
+
+
+
diff --git a/app/src/main/res/layout/popupview_textview.xml b/app/src/main/res/layout/popupview_textview.xml
new file mode 100644
index 0000000000000000000000000000000000000000..7e4c439890b13bce5074fa63e2c6c9e1b7787ed6
--- /dev/null
+++ b/app/src/main/res/layout/popupview_textview.xml
@@ -0,0 +1,20 @@
+
+
+
diff --git a/app/src/main/res/layout/stamp_image_text.xml b/app/src/main/res/layout/stamp_image_text.xml
new file mode 100644
index 0000000000000000000000000000000000000000..d63a23bd0195d94b34af8adf125729889b41cb9e
--- /dev/null
+++ b/app/src/main/res/layout/stamp_image_text.xml
@@ -0,0 +1,19 @@
+
+
+
+
+
+
+
diff --git a/app/src/main/res/layout/toast_textview.xml b/app/src/main/res/layout/toast_textview.xml
new file mode 100644
index 0000000000000000000000000000000000000000..2d26286347ab443be49a758583503b28eab9b497
--- /dev/null
+++ b/app/src/main/res/layout/toast_textview.xml
@@ -0,0 +1,13 @@
+
+
+
diff --git a/app/src/main/res/menu/main.xml b/app/src/main/res/menu/main.xml
index c00202823049e8d5b1590e1223f20fef8a966353..c2b1c0c91ed1d60a5da0285f997f03415f712eda 100644
--- a/app/src/main/res/menu/main.xml
+++ b/app/src/main/res/menu/main.xml
@@ -1,9 +1,10 @@
-