Throughout the iOS 7 beta phase I filed Radars and produced samples to go with them, to demonstrate the issues to Apple engineers. I wasn’t allowed to blog about these until the general release of iOS 7, so I kept collecting them in a special folder in my Dropbox.
Now that iOS 7 is out I am able to add the samples to my public Radar Samples repo on GitHub. I hope that they can be a good example of how to create samples that allow Apple engineers to quickly reproduce and debug those issues.
CATiledLayer Loses Visible Tiles on Memory Warning
Filed as rdar://14942449, sample is CATiledLayerMemory. A possible workaround is to track the memory warning, then set the layer’s contents property to nil and call setNeedsDisplay as soon as the user scrolls.
A CATiledLayer visible on screen will lose visible tiles if you start scrolling after receiving a memory warning.
Steps to Reproduce
1. Run the provided sample app in iOS 7 Simulator
2. Scroll down a little (optional)
3. Simulate Memory Warning via Simulator menu option
4. Begin to scroll
Expected Results
Visible tiles should move during scrolling and remain visible.
Actual Results
Visible tiles disappear and the white background becomes visible.
It cannot be reproduced under iOS 6, so this is a regression. Also, if you don’t scroll right after the memory warning, you will see the area behind the status bar clock turn white.
This is consistently reproducible in the iOS 7 Simulator, and it has been reported to occur in a memory-intensive app on device as well.
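The workaround mentioned above could be sketched roughly like this, assuming a view controller that is also the scroll view’s delegate; the `tiledView` property and `_tilesNeedRefresh` ivar are hypothetical names, not part of the actual sample:

```objc
// Sketch of the workaround: remember that tiles may have been purged,
// then reset the layer contents on the next scroll.
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];

    _tilesNeedRefresh = YES;
}

- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    if (_tilesNeedRefresh)
    {
        _tilesNeedRefresh = NO;

        // throw away the stale backing store and force the
        // CATiledLayer to redraw its visible tiles
        self.tiledView.layer.contents = nil;
        [self.tiledView.layer setNeedsDisplay];
    }
}
```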
Rounded rect bezier path with radius 3 drawn with scale 1 produces visual artifacts
Filed as rdar://14954549, sample is RoundedRectArtifact. The workaround is to pick a different corner radius.
If you draw a UIBezierPath rounded rect with corner radius 3 into a bitmap context with scale 1, you get visual artifacts at the top corners.
Steps to Reproduce
Run the provided sample app with Xcode 5 on iOS 7 Simulator
Expected Results
The middle image with corner radius 3 should be smooth.
Actual Results
The middle image has strong visual artifacts at the top corners. Also see the provided screenshot.
This is a regression. If you launch the sample app on 6.1 simulator there are no artifacts.
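For reference, the problematic drawing boils down to a few lines like these (a minimal sketch, not the exact sample code; the size and rect values are arbitrary):

```objc
// Render a rounded rect with corner radius 3 into a scale-1 bitmap
// context – on iOS 7 this produces artifacts at the top corners.
UIGraphicsBeginImageContextWithOptions(CGSizeMake(100, 100), NO, 1);

UIBezierPath *path = [UIBezierPath bezierPathWithRoundedRect:CGRectMake(10, 10, 80, 80)
                                                cornerRadius:3];
[[UIColor blackColor] setFill];
[path fill];

UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```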
NSAttributeDictionary Returns Color that produces EXC_BAD_ACCESS on stroking
Filed as rdar://14952597, sample is UIColorStrokeCrash. The workaround is to check for the iOS 6/7 attribute first and only retrieve the Core Text attribute if that is nil.
If you set an NSForegroundColor on an attributed string, the glyph run attributes contain a CTForegroundColor attribute that – if set as the stroke color on a graphics context – causes an EXC_BAD_ACCESS.
Steps to Reproduce
- Run the provided sample app on the iOS 7 Simulator
- Tap the “Crash Now” button
Expected Results
- The app should continue running.
- The glyph run’s NSAttributeDictionary should not return a value for kCTForegroundColorAttributeName – or if it does, the color should be valid for stroking.
Actual Results
- The glyph run’s NSAttributeDictionary returns a CGColor object that appears to be a representation of the NSForegroundColor.
- An EXC_BAD_ACCESS occurs when trying to stroke a rect with this color.
The glyph run’s NSAttributeDictionary should not return a value for kCTForegroundColorAttributeName. I suspect that you have implemented NSAttributeDictionary to allow dual access (Core Text and “modern attributes”) to certain attributes. If that is the case, then you might have an over-release problem for the CGColor.
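The workaround could be sketched like this; `run` and `context` are assumed to come from the surrounding Core Text drawing code:

```objc
// Workaround sketch: prefer the iOS 6/7 foreground color attribute and
// only fall back to the Core Text attribute if the modern one is absent.
NSDictionary *attributes = (__bridge NSDictionary *)CTRunGetAttributes(run);

UIColor *foregroundColor = [attributes objectForKey:NSForegroundColorAttributeName];
CGColorRef cgColor;

if (foregroundColor)
{
    // safe: a proper UIColor set via the modern attribute
    cgColor = foregroundColor.CGColor;
}
else
{
    // only retrieve the Core Text attribute if the modern one is nil
    cgColor = (__bridge CGColorRef)[attributes objectForKey:(id)kCTForegroundColorAttributeName];
}

CGContextSetStrokeColorWithColor(context, cgColor);
```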
asl_search only finds messages with ReadUID set
Filed as rdar://14670536, sample is LogTest. Look at the AppDelegate.m. The workaround is to set the ReadUID to 501 or -1 if you are logging via a custom logger.
Messages logged via asl_log with no explicit ReadUID set cannot be found with asl_search.
Steps to Reproduce
- asl_log(NULL, NULL, ASL_LEVEL_NOTICE, "TEST2");
- perform an asl_search without a filter
Expected Results
- you should see TEST1 with level 4
- you should see TEST2 with level 5
Actual Results
- you only see TEST1, which has a ReadUID of 501
- you don’t see TEST2
This is a new error as of iOS 7; under iOS 6 the ReadUID did not have to be set to find these messages. Apparently ASL was also modified to only return the current app’s messages (which is fine), but I suspect this is where the disparity comes from.
You probably want to set the ReadUID to the current user’s ID (501) or to -1 by default in asl_log, OR let asl_search also find messages that have no ReadUID set. An unset ReadUID should be treated as -1.
As it stands, you need to create a new aslmsg just to set the ReadUID and cannot let asl_log create one for you by passing NULL.
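The workaround – creating the aslmsg yourself instead of passing NULL – could look like this (a sketch; the exact ReadUID value to use depends on your logging setup):

```objc
#include <asl.h>

// Workaround sketch: create the message yourself so you can set
// ReadUID before logging, instead of passing NULL to asl_log.
aslmsg msg = asl_new(ASL_TYPE_MSG);
asl_set(msg, "ReadUID", "-1"); // or "501" for the current mobile user

asl_log(NULL, msg, ASL_LEVEL_NOTICE, "TEST2");

asl_free(msg);
```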
AVCaptureMetadataOutput ignores Full-Screen rectOfInterest
Filed as rdar://14427767, sample is ScanAreaBug. My workaround is to use a substantially smaller “scan window” to ensure that the horizontal or vertical scan line must cross the barcode. Further tests have shown that the iPhone 4 seems to have only 2 scan lines; newer iPhones and iPads have more.
To scan for barcodes you add an AVCaptureMetadataOutput to an AVCaptureSession. I noticed that – on an iPhone 4 – only the upper half of the camera video detects bar codes even though the rectOfInterest was set to (0,0,1,1). Setting it to something smaller seems to work, but setting it back to full screen has no effect.
Steps to Reproduce
- on an iPhone 4:
- set up an AVCaptureMetadataOutput to look for EAN8, EAN13 and UPC codes
- try to capture bar codes in the lower half of a full-screen preview layer
Expected Results
- bar codes should be recognized anywhere on the preview layer if the rectOfInterest is not set
Actual Results
- bar codes are only detected in the upper half of the preview layer
Note: bar code scanning functionality is new as of iOS 7.
If you set a smaller rectOfInterest, this can also be in the lower half of the preview layer and detection will work. So there is no general detection problem for this region, only when the rectOfInterest is too large.
If this unexpected behavior has a rationale behind it (iPhone 4 performance?) then it should be documented. If a smaller rectOfInterest is actually the default, then this should be set as the property’s value instead of (0,0,1,1).
Engineering has provided the following feedback regarding this issue:
To scan 1 dimensional barcodes we use a fixed pattern of scan lines through the rect of interest. The expected behavior here is that you should (at minimum) be able to scan a 1D barcode whose first and last bars fall within either the horizontal or vertical center lines of the rect of interest. There may be other scan lines as well, depending on your device/AVCaptureMetadataOutput configuration, but the center horizontal and vertical scan lines are always on.
I created a demo. This has a rectOfInterest that is almost the entire screen. So this should recognize any bar code inside this area, at least that is what I would expect from the documentation or the WWDC video.
But the actual behavior – demonstrated in the sample app – is that a bar code is only recognized if it is touched by the horizontal or vertical center line. You can move a bar code into the area beside those center lines and it will not be recognized.
The engineer stated: “There may be other scan lines as well, depending on your device/AVCaptureMetadataOutput configuration, but the center horizontal and vertical scan lines are always on.” I cannot see such additional scan lines; for a full-screen rectOfInterest I see only one vertical and one horizontal scan line, at least on an iPhone 4.
For a larger rectOfInterest you would need to add additional scan lines; I estimate about 4–5 vertical ones are required, given a minimum useful scan size of about 2 cm.
I thought that it would be a good idea to have a larger scan area. But the current behavior – while understandable and efficient – causes developer confusion, as it confused me.
Another thought: I would assume from the documentation that I can scan multiple bar codes at the same time. The only way to achieve that would be to have a larger rectOfInterest with multiple scan lines.
Further tests have shown that this behavior appears to be limited to older iPhones like the iPhone 4; on an iPhone 5 and an iPad 4 there appear to be more scan lines. Still, I consider this issue open because the number of scan lines on the iPhone 4 is too limited to be useful. The current setup forces developers to have a UI with a smaller “scan window” solely for the iPhone 4.
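The workaround setup could be sketched like this, assuming an existing `captureSession` and a delegate implementing AVCaptureMetadataOutputObjectsDelegate; the narrow-band rect value is an illustrative choice, not taken from the sample:

```objc
// Setup sketch for bar code scanning with a reduced scan window,
// so that the always-on center scan lines must cross the bar code.
AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[captureSession addOutput:output]; // add before setting metadataObjectTypes

[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
output.metadataObjectTypes = @[AVMetadataObjectTypeEAN8Code,
                               AVMetadataObjectTypeEAN13Code,
                               AVMetadataObjectTypeUPCECode];

// rectOfInterest is in metadata coordinates, where (0,0,1,1) is the
// full video frame (note: these do not match preview layer
// coordinates). A narrow band works around the iPhone 4 limitation.
output.rectOfInterest = CGRectMake(0.4, 0.0, 0.2, 1.0);
```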
During the iOS 7 beta phase I tried to elevate the quality of my sample projects, hoping that this would enable the Apple engineers to address these issues more quickly. I’m sad to say that this didn’t work out; it doesn’t look like my issues were important enough next to loads of other, more severe show-stoppers.
I do have several issues with samples that I was able to move into the “fixed as of iOS 7” section, though honestly only the QuickLook bug was actually addressed. The text-related issues were fixed by Apple revamping UIKit attributed-text display via Text Kit, and the one about full-screen layout was obsoleted by the status bar changes in iOS 7.
The one critical bug in CATiledLayer was reported to me by a DTCoreText user way too late to matter for the iOS 7 GM. All other issues I found have relatively straightforward workarounds, though it would be nice to see them addressed in a point release in the near future.
Categories: Bug Reports