Wednesday, May 6, 2015

GMSMapView with custom tile overlay blinking when map is zooming

I use a custom GMSTileLayer with pre-defined tiles that are stored in the app.

When I increase/decrease the map zoom level in the official Google Maps app, the map stretches the old tiles, and they remain on screen until the newly needed tiles have loaded.

In my app, on every change of zoom or frame, the map blinks while redrawing tiles, and the old tiles are not displayed (I see a grey color, not the old "stretched" tiles).

How do I make zooming/tiling work the "right" way, like in the official Google Maps app?

How should I architect my app to process incoming Bluetooth data?

I'm working on a project where I have an app that receives data over BLE from a wearable peripheral, but I'm struggling with how to architect the app. Currently I have a singleton BLEManager class that constantly receives data in the background and then uses NSNotificationCenter to send it to the active view controller. This works but has gotten messy and seems non-ideal since I have multiple view controllers that each process the data in the same way and then just display it differently. Additionally, there are some settings related to the data processing that can be changed in app and need to be the same everywhere. It would be nice if the BLEManager sent the data to a central processing class and then the processed data was sent to the active view controller but I'm not sure the best way to set this up.

I could incorporate all the processing into the BLEManager class, but then it would get pretty bloated and unintuitive, and wouldn't be nice to work with moving forward. If I made a separate processing class that's a property of the BLEManager, I'd have to go through the BLEManager whenever I wanted to get or change any variables in the processing class from anywhere else, which would be annoying. I could make a singleton processing class that receives data from the BLEManager and then sends it to the active VC, but I've seen people say to avoid singletons, so I'm hesitant to add another one even though this seems like it could be a good solution.

Is there a standard or recommended way to architect an iOS app to process incoming data from Bluetooth and then send it wherever it's needed?
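For what it's worth, one common shape for this kind of pipeline (a sketch only; the names DataProcessor, DataProcessorDelegate, and ingest are illustrative, not an established API) is a single processing object owned by the manager, with the active view controller as its delegate:

```swift
// Sketch: BLEManager feeds raw packets to one shared DataProcessor;
// whichever view controller is active registers as its delegate.
// All type and method names here are illustrative.
protocol DataProcessorDelegate: class {
    func processor(processor: DataProcessor, didProduceValue value: Double)
}

class DataProcessor {
    weak var delegate: DataProcessorDelegate?
    var smoothingWindow = 5   // processing settings live in exactly one place

    func ingest(rawBytes: [UInt8]) {
        // ...decode and filter the raw BLE payload here...
        let value = Double(rawBytes.first ?? 0)   // placeholder computation
        delegate?.processor(self, didProduceValue: value)
    }
}
```

The BLEManager would own the DataProcessor and call ingest from its characteristic-update callback; each view controller assigns itself as delegate in viewWillAppear. That keeps the processing code and its settings out of both the BLEManager and the view controllers without introducing a second singleton.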

Invalid Binary with Invalid Signature

I am in the process of submitting my first Ionic application to the store. However I am receiving an error every single time that I try to submit my binary to the store:

Invalid Signature - A sealed resource is missing or invalid. The binary at path [Who Paid Last?.app/Who Paid Last?] contains an invalid signature. Make sure you have signed your application with a distribution certificate, not an ad hoc certificate or a development certificate....

I have verified that my certificates are correct. I have even tried moving my .git folder outside of the directory, but still no luck. I have tried about 7 different configurations and still the same result time and time again.

The crazy thing is that after my archive has been built, I validate it using Xcode's validator. The validator says that my .ipa has zero errors and is ready for submission, but iTunes Connect says differently.

I have also tried following this tutorial on troubleshooting but without any luck.

Submit a new build for iOS App for another account

I am an iOS developer and publish apps under my own name.

A friend asked me to help him update his own app. He is registered as a developer himself and has valid accounts, and I would like to submit the new build myself. Of course, he gave me the full source code.

He tried to add me as a technical user, but this was not accepted. Then I created a new mail account, which worked. What is the next step? Do I need his certificates, profiles, and so on? Or is it just not possible?

Of course, I could update the code on my side and then give him back the full directory, but that is not very satisfying.

Thank you.

AutoLayout: create constraints programmatically

(image: blue container holding a red view and a black view, with yellow and green lines marking the paddings)

How do I correctly pin the black view to the red view with the padding shown, using NSLayoutConstraint's constraintWithItem?

I placed the red view in the blue container with constraints, and that works: the red square has a width of, say, 50pt and the same height, and I've pinned it to the left and top sides of the blue container.

My goal is to pin the black view to the red view with some distance (the yellow line) of, say, 5pt, and then the black view should stretch to the right side of the blue container, with the green line being, for example, 10pt.

The lines just show constant distances between the views.
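For reference, the constraints described above could be sketched like this (assuming blackView, redView, and container are the black view, red view, and blue container, and that setTranslatesAutoresizingMaskIntoConstraints(false) has been called on blackView; the variable names are illustrative):

```swift
// 5pt gap between the red view's trailing edge and the black view's leading edge
let gap = NSLayoutConstraint(item: blackView, attribute: .Leading,
    relatedBy: .Equal,
    toItem: redView, attribute: .Trailing,
    multiplier: 1.0, constant: 5.0)

// the black view stretches to 10pt from the container's trailing edge
let trailing = NSLayoutConstraint(item: container, attribute: .Trailing,
    relatedBy: .Equal,
    toItem: blackView, attribute: .Trailing,
    multiplier: 1.0, constant: 10.0)

// align the black view's top and height with the red view
let top = NSLayoutConstraint(item: blackView, attribute: .Top,
    relatedBy: .Equal,
    toItem: redView, attribute: .Top,
    multiplier: 1.0, constant: 0.0)
let height = NSLayoutConstraint(item: blackView, attribute: .Height,
    relatedBy: .Equal,
    toItem: redView, attribute: .Height,
    multiplier: 1.0, constant: 0.0)

container.addConstraints([gap, trailing, top, height])
```

With the leading, trailing, top, and height constraints in place, the black view's width is fully determined by the container, so it stretches as the container resizes.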

How to fetch data using alchemy api

I have a Cloudant database with a table named tasks, and I want to implement AI in my iOS app. When a user taps the AI button, I need to show the user's tasks for a particular day in natural language using the Alchemy API.

How do I integrate the Alchemy API into an iOS app with a Cloudant DB?

PhoneGap - Apple rejects the app with ITMS-90096 errors (splash screen)

I am trying to deploy an app to the App Store, but Apple rejects it. I am getting error ITMS-90096, which is specific to the 4-inch splash screen for the iPhone 5. I have added the splash screen to the app and referenced it in config.xml. I have tried various solutions I found on this site and on others, but no luck. I'm not sure how to fix it, and it's getting really frustrating now. Can someone help? A screenshot of the error is here: http://ift.tt/1zC3LAy

Get data after request is done

In my case, the main problem is that I can't get data after the user has registered.

In my Windows Phone app I used to get the data by handling the LoadCompleted event.

This code shows how I've done it before:

private void AuthBrowser_LoadCompleted(object sender, NavigationEventArgs e)
    {
        string responceData = e.Uri.OriginalString;
        if (responceData.Contains("access_token"))
        {
            if (settings.Contains("IsRegistered"))
            {
                settings["IsRegistered"] = true;
            }
            else
            {
                settings.Add("IsRegistered", true);
            }
            var parameters = responceData.Split('#')[1].Split('&');
            var accessToken = parameters[0].Substring(parameters[0].IndexOf("=", StringComparison.Ordinal)).Remove(0, 1);
            if (settings.Contains("AccessToken"))
            {
                settings["AccessToken"] = accessToken;
            }
            else
            {
                settings.Add("AccessToken", accessToken);
            }
            var expiresIn = parameters[1].Substring(parameters[1].IndexOf("=", StringComparison.Ordinal)).Remove(0, 1);
            var uID = parameters[2].Substring(parameters[2].IndexOf("=", StringComparison.Ordinal)).Remove(0, 1);
            if (settings.Contains("UserId"))
            {
                settings["UserId"] = uID;
            }
            else
            {
                settings.Add("UserId", uID);
            }
            if (settings.Contains("IsRegistered"))
            {
                settings["IsRegistered"] = true;
            }
            else
            {
                settings.Add("IsRegistered", true);
            }
            NavigationService.Navigate(new Uri("/Menu.xaml", UriKind.Relative));
        }
    }

But now, in my iOS app, I don't understand how to do this.

I've used WKWebView.

This is my code sample:

import UIKit
import WebKit

class ViewController: UIViewController {

    @IBOutlet var containerView : UIView! = nil
    var webView: WKWebView?
    var uiWebView: UIWebView?
    override func loadView() {
        super.loadView()
        self.uiWebView = UIWebView()
        self.webView = WKWebView()
        self.view = self.webView!

    }

    override func viewDidLoad() {
        super.viewDidLoad()

        var url = NSURL(string:"http://ift.tt/1Jra8XG;" +
            "redirect_uri=http://ift.tt/1Jra8XI")
        var req = NSURLRequest(URL:url!)

        self.webView!.loadRequest(req)

    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

}
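A rough WKWebView counterpart to the LoadCompleted handler above (a sketch under the assumption that the redirect puts the token in the URL fragment, exactly as in the Windows Phone code) is to adopt WKNavigationDelegate and inspect each finished navigation:

```swift
// Sketch: watch each finished navigation for the OAuth redirect and parse
// the fragment, mirroring the Windows Phone LoadCompleted handler.
// Requires self.webView!.navigationDelegate = self, e.g. in loadView().
extension ViewController: WKNavigationDelegate {

    func webView(webView: WKWebView, didFinishNavigation navigation: WKNavigation!) {
        // the fragment looks like "access_token=...&expires_in=...&user_id=..."
        if let fragment = webView.URL?.fragment where fragment.rangeOfString("access_token") != nil {
            var parameters = [String: String]()
            for pair in fragment.componentsSeparatedByString("&") {
                let parts = pair.componentsSeparatedByString("=")
                if parts.count == 2 { parameters[parts[0]] = parts[1] }
            }
            let accessToken = parameters["access_token"]
            // store the token (e.g. in NSUserDefaults) and move on to the next screen
        }
    }
}
```

This is only an outline of the mechanism; depending on the auth provider, the token may arrive in a query string instead of the fragment, in which case the parsing would target webView.URL?.query.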

Thanks.

Getting 'npm start' error when trying to build React Native app on iPhone?

When I do a build to my iPhone 6, I get the following error on the device, with the red screen:

"Could not connect to development server. Ensure node server is running - run 'npm start' from React root

The operation couldn't be completed.
(NSURLErrorDomain error - 1004)"

And I'm getting the following error in the terminal, in the React Packager:

"Unhandled rejection Error: ENOENT, open '/Users/rahulsharma/Desktop/rcapp/.git/index.lock'"

Lastly, I am getting this in the Xcode console:

2015-05-06 12:37:48.631 rcapp[808:262187] CLTilesManagerClient: XPC_ERROR_CONNECTION_INVALID!

Any ideas?

Here is the code in git:

http://ift.tt/1P8JIeM

Regarding App development with Phonegap

I am comfortable with developing web apps and am now getting into developing mobile apps. I have searched around and found that PhoneGap would be a great way to start developing apps for multiple devices, which is exactly what I did. Currently, I have PhoneGap installed with Node.js. I have created my first hello-world app and can preview it on my iPhone (using the PhoneGap Developer app). I also learned that to build these apps for multiple devices, I have to have an Adobe ID and do it on their website.

Now all that is clear to me. My questions are as follows.

  1. Can I use PHP with a MySQL database with PhoneGap for the server and database? If not, what do I use? Node.js?

  2. Do I need to use Angular.js?

  3. How are these Apps hosted? Like on regular web hosting?

  4. I am assuming that since it's an app, I can use the geolocation services built into the phone? Much like how Tinder uses them to determine users' locations?

  5. Is there anything else you think I should know before I get started?

How do I instantiate a UIView / UITableView from a xib (nib) in a ViewController

I want to display a user interface that has a segmented control, and a different table view for each section of the segmented control; so, 2 table views (buddies and bunches) that can be switched between. To implement this, I have done the following

  1. Create a ViewController in Storyboard
  2. Delete the View from the ViewController
  3. Create a new UIViewController swift class with an associated xib file
  4. Put the segmented control in the main UIView in the xib
  5. Put an inner UIView element inside the main UIView to occupy the space where the table views will go
  6. Create two subclasses of UITableView and corresponding xib files

Some options I have thought of:

  • I can set the class of the inner UIView in Interface Builder to be that of one of the table views, but I wouldn't know how to instantiate the other one in place of the initial one. If I created overlapping inner UIViews, each associated with a table view, and hid one of them when switching the segmented control, that actually kind of works, but the overlapping views make layout difficult and unintuitive.
  • What I want to know how to do: Instantiate the table views in place of the single main UIView element
  • Alternative: Have one UITableView subclass that has a condition based on the state of the segmented control for what data it displays. I don't like this as much because it will mix the code together for the table views. In this case, I wouldn't even need to use xibs anymore, I could do this in the storyboard with just one table view.
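The second bullet could be sketched roughly like this (assuming each xib's top-level object is the corresponding UITableView subclass; the method name showTableView is illustrative):

```swift
// Sketch: load a table view from its nib and swap it into the placeholder
// view (the `tableView` outlet, which is a plain UIView in the controller).
func showTableView(nibName: String) {
    // remove whichever table view is currently shown
    for sub in tableView.subviews as! [UIView] {
        sub.removeFromSuperview()
    }
    let nib = UINib(nibName: nibName, bundle: nil)
    if let table = nib.instantiateWithOwner(self, options: nil).first as? UITableView {
        table.frame = tableView.bounds
        table.autoresizingMask = UIViewAutoresizing.FlexibleWidth | UIViewAutoresizing.FlexibleHeight
        tableView.addSubview(table)
    }
}
```

The segmented-control action would then call showTableView("BuddiesTableView") or showTableView("BunchesTableView") (the latter name is an assumption about the second xib), so only one table view exists at a time and no overlapping layout is needed.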

** ViewController Code **

@objc(BuddiesBunchesViewController) class BuddiesBunchesViewController: UIViewController {

    @IBOutlet weak var segmentedControl: UISegmentedControl!

    @IBOutlet weak var tableView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Do any additional setup after loading the view.

        // Instantiate tableView here to BuddiesTableView
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func segmentedControlIndexChanged(sender: AnyObject) {
        switch segmentedControl.selectedSegmentIndex {
        case 0: // Buddies
            // Set tableview to the buddies table view
        case 1: // Bunches
            // Set tableview to the bunches table view
        default:
            break;
        }
    }
}

** Table View **

@IBDesignable class BuddiesTableView: UITableView, UITableViewDataSource, UITableViewDelegate {

    var view: UIView!

    var nibName: String = "BuddiesTableView"

    //init

    override init(frame: CGRect) {
        // set properties

        super.init(frame: frame)

        // Set anything that uses the view or visible bounds
        setup()
    }

    required init(coder aDecoder: NSCoder) {
        //set properties

        super.init(coder: aDecoder)

        // Setup
        setup()
    }

    func setup() {
        view = loadViewFromNib()

        view.frame = self.bounds
        view.autoresizingMask = UIViewAutoresizing.FlexibleWidth | UIViewAutoresizing.FlexibleHeight

         addSubview(view)
    }

    func loadViewFromNib() -> UIView {
        let bundle = NSBundle(forClass: self.dynamicType)
        let nib = UINib(nibName: nibName, bundle: bundle)
        let view = nib.instantiateWithOwner(self, options: nil)[0] as! UIView

        return view
    }

// MARK: - Table View

    func tableView(tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
         return 10
    }

    func tableView(tableView: UITableView, cellForRowAtIndexPath indexPath: NSIndexPath) -> UITableViewCell {
        let cell: UITableViewCell = UITableViewCell(style: UITableViewCellStyle.Subtitle, reuseIdentifier: "MyTestCell")
        cell.textLabel?.text = "\(indexPath.row)"
        return cell
    }
}

How can I restore custom photo albums that I've deleted from the photo app?

I've created an app to take and store pictures in a custom album of my choice using the following code:

// This block of code creates a new album for you
[self.library addAssetsGroupAlbumWithName: albumName

                              resultBlock: ^(ALAssetsGroup *group) {
                                  NSLog(@"Added awesome album:%@", albumName);
                              }

                             failureBlock: ^(NSError *error) {
                                 NSLog(@"Error adding album");
                             }];

Now let's say the custom album is called "Foo". If I delete "Foo" from the iPhone Photos app and then take a picture from the app I made to store a picture in the Foo album, "Foo" is not re-created when I check the Photos app.

Plot twist, though: while debugging, "Foo" is actually still there and storing pictures, but it is not displayed; when I re-open the iPhone Photos app, "Foo" is nowhere to be found.

What can I do to make "Foo" re-appear or be re-created?

Thank you

How to create a crescent moon using bezier paths in iOS?

How do I draw a closed crescent moon using bezier paths, when the stroke and fill are configurable? So far I can get one curve, but I haven't found a strategy to connect and draw the other curve.

(image: the target crescent moon shape)
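One possible strategy for the missing second curve (a sketch only; the angles and the quad-curve control point are arbitrary choices, not the canonical crescent) is to draw the long outer arc first and then curve back to the arc's start point, producing a single closed path that both strokes and fills cleanly:

```swift
// Sketch: a closed crescent built from one big arc plus a curve back.
// Angles and offsets are illustrative.
func crescentPath(center: CGPoint, radius: CGFloat) -> UIBezierPath {
    let path = UIBezierPath()
    // outer edge: sweep the long way around the moon's circle,
    // from 45° to -45°, leaving a gap on the right-hand side
    path.addArcWithCenter(center, radius: radius,
        startAngle: CGFloat(M_PI_4), endAngle: CGFloat(-M_PI_4), clockwise: true)
    // inner edge: curve back to the arc's start point, carving out the crescent
    let start = CGPoint(x: center.x + radius * CGFloat(cos(M_PI_4)),
                        y: center.y + radius * CGFloat(sin(M_PI_4)))
    let control = CGPoint(x: center.x + radius * 0.2, y: center.y)
    path.addQuadCurveToPoint(start, controlPoint: control)
    path.closePath()
    return path
}
```

Because the path is closed, setting strokeColor/fillColor (or calling stroke() and fill()) both trace the full outline; replacing the quad curve with a second addArcWithCenter on an offset circle would give a perfectly circular inner edge.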

UIImage Not Showing From Saved Directory in Swift

I am initiating a download of images from an external URL and then saving them to the app's documents directory for future loading into table view cells.

The images load successfully after the initial download and seem to be saved to the correct location (?), but when loading them from the directory path, I am just seeing a blank image view. Can anyone help me know what exactly I am doing wrong?

var paths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as! String
    var dirPath = paths.stringByAppendingPathComponent("images/\(self.contentids[indexPath.row])/")
    var imagePath = paths.stringByAppendingPathComponent("images/\(self.contentids[indexPath.row])/\(self.images[indexPath.row])")
    var checkImage = NSFileManager.defaultManager()

    var remoteImage = "\(baseUrl+self.images[indexPath.row])"

    if checkImage.fileExistsAtPath(imagePath) {

        let getImage = UIImage(contentsOfFile: imagePath)
        cell?.cellImageView.image = getImage

    } else {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0)) {

            checkImage.createDirectoryAtPath(dirPath, withIntermediateDirectories: true, attributes: nil, error: nil)

            let getImage =  UIImage(data: NSData(contentsOfURL: NSURL(string: remoteImage)!)!)

            UIImageJPEGRepresentation(getImage, 1.0).writeToFile(imagePath, atomically: true)   // JPEG quality is 0.0-1.0, not 0-100

            dispatch_async(dispatch_get_main_queue()) {

                cell?.cellImageView.image = getImage
                return

            }
        }
    }

Xcode - how to use a localized string from the Attributes Inspector

In the Attributes Inspector I have a placeholder (a.k.a. a hint) for a text field. The hint is called USERNAME. I'd like to localize this. I have already added the necessary strings to a Localizable.strings file. My question is that I would rather reference this string from the inspector itself: I wish I could do something like localstrings(@"username") right inside the placeholder field, i.e. localize the placeholder string directly from Interface Builder. I've attached an image showing what I'd like to accomplish.

(image: the placeholder field in the Attributes Inspector)
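If the inspector can't do this directly, the usual code-side fallback (shown only as an illustration; the usernameField outlet and the "username" key are assumptions) is a single line in viewDidLoad:

```swift
// Sketch: set the localized placeholder in code instead of in the inspector.
usernameField.placeholder = NSLocalizedString("username", comment: "Login form username hint")
```

This keeps the key in Localizable.strings as described above, at the cost of one line per localized control.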

iOS Today Extension keeps crashing on iPhone, not in simulator

I'm trying to build an iOS Today Extension that shows three images with some text. In the simulator it runs fine, but when I run it on my iPhone, it flashes three times and then displays "Unable to Load". What am I doing wrong?

TodayViewController.m

#import "TodayViewController.h"
#import <NotificationCenter/NotificationCenter.h>
#import "UIImageView+WebCache.h"
#import "SDImageCache.h"

@interface TodayViewController () <NCWidgetProviding>

@property (strong, nonatomic) UILabel *descriptionLabel;

@property (strong, nonatomic) UIImageView *firstImage;
@property (strong, nonatomic) UIImageView *secondImage;
@property (strong, nonatomic) UIImageView *thirdImage;

@property (strong, nonatomic) UILabel *firstImageLabel;
@property (strong, nonatomic) UILabel *secondImageLabel;
@property (strong, nonatomic) UILabel *thirdImageLabel;

@property (strong, nonatomic) UILabel *firstImageOwnerLabel;
@property (strong, nonatomic) UILabel *secondImageOwnerLabel;
@property (strong, nonatomic) UILabel *thirdImageOwnerLabel;

@property (strong, nonatomic) NSDictionary *dataOne;
@property (strong, nonatomic) NSDictionary *dataTwo;
@property (strong, nonatomic) NSDictionary *dataThree;

@property (nonatomic) NSInteger quarterSize;
@property (nonatomic) NSInteger eightSize;

@end

@implementation TodayViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view from its nib.

    self.preferredContentSize = CGSizeMake(self.view.frame.size.width, 320);

    [self updateNumberLabelText];

    if ([self.dataOne count] == 0) {
        UILabel *noContent = [[UILabel alloc] initWithFrame:CGRectMake((self.view.frame.size.width/2)-150, 93, 300, 44)];
        noContent.text = @"You haven't opened the app yet.";
        [self.view addSubview:noContent];
    } else {
        NSString *deviceType = [UIDevice currentDevice].model;

        if([deviceType isEqualToString:@"iPhone"] || [deviceType isEqualToString:@"iPhone Simulator"])
        {
            self.quarterSize = self.view.frame.size.width/4;
            self.eightSize = self.quarterSize/4;
        } else if([deviceType isEqualToString:@"iPad"] || [deviceType isEqualToString:@"iPad Simulator"])
        {
            self.quarterSize = self.view.frame.size.width/5;
            self.eightSize = self.quarterSize/4;
        }

        self.descriptionLabel = [[UILabel alloc] initWithFrame:CGRectMake(self.eightSize, 15, self.view.frame.size.width-self.quarterSize, 20)];
        self.descriptionLabel.text = @"Some new images just for you!";
        self.descriptionLabel.textColor = [UIColor whiteColor];
        [self.view addSubview:self.descriptionLabel];

        UIView *firstView = [[UIView alloc] initWithFrame:CGRectMake(self.eightSize, 45, self.quarterSize, self.quarterSize*2)];
        UITapGestureRecognizer *singleFingerTap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(openFirstImage:)];
        [firstView addGestureRecognizer:singleFingerTap];

        if ([[self.dataOne objectForKey:@"imageurl"] isEqualToString:@"empty"]) {
            UIView *noImageOne = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.quarterSize, self.quarterSize*1.25)];
            noImageOne.backgroundColor = [self paperColorLightBlue500];
            [firstView addSubview:noImageOne];
        } else {
            self.firstImage = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.quarterSize, self.quarterSize*1.25)];
            __block UIActivityIndicatorView *activityIndicator;
            __weak UIImageView *weakImageView = self.firstImage;
            [self.firstImage sd_setImageWithURL: [NSURL URLWithString:[self.dataOne objectForKey:@"imageurl"]]
                 placeholderImage:[UIImage imageNamed:@"placeholder.png"]
                              options:SDWebImageProgressiveDownload
                             progress:^(NSInteger receivedSize, NSInteger expectedSize) {
                                 if (!activityIndicator) {
                                     [weakImageView addSubview:activityIndicator = [UIActivityIndicatorView.alloc initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleGray]];
                                     activityIndicator.center = weakImageView.center;
                                     [activityIndicator startAnimating];
                                 }
                             }
                            completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, NSURL *imageURL) {
                                [activityIndicator removeFromSuperview];
                                activityIndicator = nil;
                            }];
            self.firstImage.contentMode = UIViewContentModeScaleAspectFill;
            [self.firstImage setClipsToBounds:YES];

            [firstView addSubview:self.firstImage];
        }

        UIView *secondView = [[UIView alloc] initWithFrame:CGRectMake(firstView.frame.origin.x + firstView.frame.size.width + self.eightSize, 45, self.quarterSize, self.quarterSize*2)];
        UITapGestureRecognizer *secondFingerTap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(openSecondImage:)];
        [secondView addGestureRecognizer:secondFingerTap];

        if ([[self.dataTwo objectForKey:@"imageurl"] isEqualToString:@"empty"]) {
            UIView *noImageTwo = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.quarterSize, self.quarterSize*1.25)];
            noImageTwo.backgroundColor = [self paperColorLightBlue500];
            [secondView addSubview:noImageTwo];
        } else {
            self.secondImage = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.quarterSize, self.quarterSize*1.25)];
            __block UIActivityIndicatorView *activityIndicator;
            __weak UIImageView *weakImageView = self.secondImage;
            [self.secondImage sd_setImageWithURL: [NSURL URLWithString:[self.dataTwo objectForKey:@"imageurl"]]
                                 placeholderImage:[UIImage imageNamed:@"placeholder.png"]
                                          options:SDWebImageProgressiveDownload
                                         progress:^(NSInteger receivedSize, NSInteger expectedSize) {
                                             if (!activityIndicator) {
                                                 [weakImageView addSubview:activityIndicator = [UIActivityIndicatorView.alloc initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleGray]];
                                                 activityIndicator.center = weakImageView.center;    
                                                 [activityIndicator startAnimating];
                                             }
                                         }
                                        completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, NSURL *imageURL) {
                                            [activityIndicator removeFromSuperview];
                                            activityIndicator = nil;
                                        }];
            self.secondImage.contentMode = UIViewContentModeScaleAspectFill;
            [self.secondImage setClipsToBounds:YES];

            [secondView addSubview:self.secondImage];
        }

        UIView *thirdView = [[UIView alloc] initWithFrame:CGRectMake(secondView.frame.origin.x + secondView.frame.size.width + self.eightSize, 45, self.quarterSize, self.quarterSize*2)];
        UITapGestureRecognizer *thirdFingerTap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(openThirdImage:)];
        [thirdView addGestureRecognizer:thirdFingerTap];

        if ([[self.dataThree objectForKey:@"imageurl"] isEqualToString:@"empty"]) {
            UIView *noImageThird = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.quarterSize, self.quarterSize*1.25)];
            noImageThird.backgroundColor = [self paperColorLightBlue500];
            [thirdView addSubview:noImageThird];
        } else {
            self.thirdImage = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.quarterSize, self.quarterSize*1.25)];
            __block UIActivityIndicatorView *activityIndicator;
            __weak UIImageView *weakImageView = self.thirdImage;
            [self.thirdImage sd_setImageWithURL: [NSURL URLWithString:[self.dataThree objectForKey:@"imageurl"]]
                                  placeholderImage:[UIImage imageNamed:@"placeholder.png"]
                                       options:SDWebImageProgressiveDownload
                                          progress:^(NSInteger receivedSize, NSInteger expectedSize) {
                                              if (!activityIndicator) {
                                                  [weakImageView addSubview:activityIndicator = [UIActivityIndicatorView.alloc initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleGray]];
                                                  activityIndicator.center = weakImageView.center;
                                                  [activityIndicator startAnimating];
                                              }
                                          }
                                         completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, NSURL *imageURL) {
                                             [activityIndicator removeFromSuperview];
                                             activityIndicator = nil;
                                         }];
            self.thirdImage.contentMode = UIViewContentModeScaleAspectFill;
            [self.thirdImage setClipsToBounds:YES];

            [thirdView addSubview:self.thirdImage];
        }

        self.firstImageLabel = [[UILabel alloc] initWithFrame:CGRectMake(0, self.firstImage.frame.origin.y + self.firstImage.frame.size.height + 10, self.quarterSize, 20)];
        self.firstImageLabel.text = [self.dataOne objectForKey:@"title"];
        self.firstImageLabel.numberOfLines = 2;
        self.firstImageLabel.textColor = [UIColor whiteColor];
        self.firstImageLabel.font = [UIFont fontWithName:@"HelveticaNeue" size:13];
        [self.firstImageLabel sizeToFit];
        [firstView addSubview:self.firstImageLabel];

        self.secondImageLabel = [[UILabel alloc] initWithFrame:CGRectMake(0, self.firstImage.frame.origin.y + self.firstImage.frame.size.height + 10, self.quarterSize, 20)];
        self.secondImageLabel.text = [self.dataTwo objectForKey:@"title"];
        self.secondImageLabel.numberOfLines = 2;
        self.secondImageLabel.textColor = [UIColor whiteColor];
        self.secondImageLabel.font = [UIFont fontWithName:@"HelveticaNeue" size:13];
        [self.secondImageLabel sizeToFit];
        [secondView addSubview:self.secondImageLabel];

        self.thirdImageLabel = [[UILabel alloc] initWithFrame:CGRectMake(0, self.firstImage.frame.origin.y + self.firstImage.frame.size.height + 10, self.quarterSize, 20)];
        self.thirdImageLabel.text = [self.dataThree objectForKey:@"title"];
        self.thirdImageLabel.numberOfLines = 2;
        self.thirdImageLabel.textColor = [UIColor whiteColor];
        self.thirdImageLabel.font = [UIFont fontWithName:@"HelveticaNeue" size:13];
        [self.thirdImageLabel sizeToFit];
        [thirdView addSubview:self.thirdImageLabel];

        self.firstImageOwnerLabel = [[UILabel alloc] initWithFrame:CGRectMake(0, self.firstImageLabel.frame.origin.y + self.firstImageLabel.frame.size.height, self.quarterSize, 30)];
        self.firstImageOwnerLabel.text = [self.dataOne objectForKey:@"owner"];
        self.firstImageOwnerLabel.numberOfLines = 1;
        self.firstImageOwnerLabel.textColor = [UIColor lightGrayColor];
        self.firstImageOwnerLabel.font = [UIFont fontWithName:@"HelveticaNeue" size:11];
        [firstView addSubview:self.firstImageOwnerLabel];
        [self.view addSubview:firstView];

        self.secondImageOwnerLabel = [[UILabel alloc] initWithFrame:CGRectMake(0, self.firstImageLabel.frame.origin.y + self.firstImageLabel.frame.size.height, self.quarterSize, 30)];
        self.secondImageOwnerLabel.text = [self.dataTwo objectForKey:@"owner"];
        self.secondImageOwnerLabel.numberOfLines = 1;
        self.secondImageOwnerLabel.textColor = [UIColor lightGrayColor];
        self.secondImageOwnerLabel.font = [UIFont fontWithName:@"HelveticaNeue" size:11];
        [secondView addSubview:self.secondImageOwnerLabel];
        [self.view addSubview:secondView];

        self.thirdImageOwnerLabel = [[UILabel alloc] initWithFrame:CGRectMake(0, self.firstImageLabel.frame.origin.y + self.firstImageLabel.frame.size.height, self.quarterSize, 30)];
        self.thirdImageOwnerLabel.text = [self.dataThree objectForKey:@"owner"];
        self.thirdImageOwnerLabel.numberOfLines = 1;
        self.thirdImageOwnerLabel.textColor = [UIColor lightGrayColor];
        self.thirdImageOwnerLabel.font = [UIFont fontWithName:@"HelveticaNeue" size:11];
        [thirdView addSubview:self.thirdImageOwnerLabel];
        [self.view addSubview:thirdView];
    }
}

- (UIColor *)paperColorLightBlue500     { return UIColorFromRGB(0x03a9f4); }

- (void)openFirstImage:(UITapGestureRecognizer *)recognizer {
    NSLog(@"Please open the First Image");
}

- (void)openSecondImage:(UITapGestureRecognizer *)recognizer {
    NSLog(@"Please open the Second Image");
}

- (void)openThirdImage:(UITapGestureRecognizer *)recognizer {
    NSLog(@"Please open the Third Image");
}

- (id)initWithCoder:(NSCoder *)aDecoder {
    if (self = [super initWithCoder:aDecoder]) {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(userDefaultsDidChange:)
                                                 name:NSUserDefaultsDidChangeNotification
                                               object:nil];
    }
    return self;
}

- (UIEdgeInsets)widgetMarginInsetsForProposedMarginInsets:(UIEdgeInsets)defaultMarginInsets
{
    return UIEdgeInsetsZero;
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
    self.firstImageLabel = nil;
    self.secondImageLabel = nil;
    self.thirdImageLabel = nil;
    self.firstImageOwnerLabel = nil;
    self.secondImageOwnerLabel = nil;
    self.thirdImageOwnerLabel = nil;
}

- (void)widgetPerformUpdateWithCompletionHandler:(void (^)(NCUpdateResult))completionHandler {
    // Perform any setup necessary in order to update the view.

    // If an error is encountered, use NCUpdateResultFailed
    // If there's no update required, use NCUpdateResultNoData
    // If there's an update, use NCUpdateResultNewData

    completionHandler(NCUpdateResultNewData);
}

- (void)userDefaultsDidChange:(NSNotification *)notification {
    [self updateNumberLabelText];
}

- (void)updateNumberLabelText {
    NSUserDefaults *defaults = [[NSUserDefaults alloc] initWithSuiteName:@"group.testapp.TodayExtensionDefaults"];
    self.dataOne = [defaults objectForKey:@"dataOne"];
    self.dataTwo = [defaults objectForKey:@"dataTwo"];
    self.dataThree = [defaults objectForKey:@"dataThree"];

    for (id key in self.dataOne) {
        NSLog(@"key: %@, value: %@ \n", key, [self.dataOne objectForKey:key]);
    }

    for (id key in self.dataThree) {
        NSLog(@"key: %@, value: %@ \n", key, [self.dataThree objectForKey:key]);
    }
}

@end

Can I use non-renewing subscriptions for an app which offers a service, but has no free features?

I want to write an app which enables the user to calculate financial credit plans. The users will be professionals in the financial sector. The idea is that they pay for each month of usage, basically a subscription. The app will not have any free features, so the subscription does not "unlock" premium features; instead the subscription "unlocks" all functionality of the app. Can I use in-app non-renewing subscriptions for that? Can someone point me to a good tutorial?

I want to move my character

I want to move my character in a game by pressing on the screen: it should run to the left when you press on the left side of the screen (the game is in landscape mode) and to the right when you press on the right side of the screen.

I've already tried a few different approaches, but they didn't work because I did something wrong. I tried using the touchesBegan and touchesEnded functions, because I read that it should be possible with these two methods, but I had no success.

I'd be grateful if you could help me.

PS: I should mention that I don't actually move the character. I have a ground node, some trees, and the character; I move the trees so that it looks like the character is moving, and I'd like to keep it that way.
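In SpriteKit terms, the approach described above might look like the following sketch (assuming an SKScene subclass in Swift 1.2; the `scrollSpeed` property, the "tree" node name, and the speed values are illustrative placeholders, not from the original question):

```swift
var scrollSpeed: CGFloat = 0

override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent) {
    for touch in touches as! Set<UITouch> {
        let location = touch.locationInNode(self)
        // Left half of the screen: scroll the scenery right so the
        // character appears to run left; right half: the opposite.
        scrollSpeed = location.x < CGRectGetMidX(frame) ? 5 : -5
    }
}

override func touchesEnded(touches: Set<NSObject>, withEvent event: UIEvent) {
    scrollSpeed = 0   // stop scrolling when the finger lifts
}

override func update(currentTime: NSTimeInterval) {
    // Move the trees instead of the character, as described above.
    enumerateChildNodesWithName("tree") { node, _ in
        node.position.x += self.scrollSpeed
    }
}
```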

CloudKit Modify Existing CKSubscription

An existing CKSubscription that has been saved to the publicDatabase cannot be modified directly, can it?

Looking at class documentations, it can only be deleted, and a new CKSubscription with new behavior can then be created.

Is this correct?

Thanks
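That matches my reading of the docs: a saved subscription's record type, predicate, and options are read-only, so "modifying" one means deleting it and saving a replacement. A hedged sketch of that pattern (the subscription ID, record type, and predicate are placeholders):

```swift
let db = CKContainer.defaultContainer().publicCloudDatabase

// Delete the old subscription, then save one with the new behaviour.
db.deleteSubscriptionWithID("old-subscription-id") { _, error in
    if error != nil { return /* handle the error */ }
    let replacement = CKSubscription(recordType: "Item",
                                     predicate: NSPredicate(value: true),
                                     options: .FiresOnRecordCreation)
    db.saveSubscription(replacement) { _, error in
        // handle the result
    }
}
```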

iOS distribution certificate is grayed out

My app is complete and now I want to distribute it on the App Store, but the iOS Distribution option is grayed out. If I delete the previous distribution certificate, will that cause any problem for my app that is already on the App Store?

iOS: Facebook user info not being saved in Parse when registering

I'm using Parse to register users from Facebook in my iOS app. This is the code I have behind my log-in button:

PFFacebookUtils.logInWithPermissions(["public_profile", "email", "user_friends"], {
        (user: PFUser!, error: NSError!) -> Void in

After the user taps it, the Facebook app or Safari opens up. After the user enters their email and password, it tells them that the app will see their public profile, email, and friends.

Everything seems to work fine until this point. The problem comes when retrieving the user info from Parse:

<PFUser: 0x7fcd3ae5cae0, objectId: irrEruRILF, localId: (null)> {
username = LeFJc8ErtMLHqLmsonfNXq8Ck;
}

As you can see, it gives me "username = LeFJc8ErtMLHqLmsonfNXq8Ck", but it doesn't have any profile information or email address... What am I missing?

WatchKit pull to refresh

Is it possible to implement pull to refresh similar to the Watch Mail app? If not, how else should I handle updating a WKInterfaceTable?

I don't really want the app to do an automatic refresh on load as this defeats the purpose of having a quick watch app.

AVPlayer display in UITableViewCell showed two more partial avplayers

I have a simple test app with a table view of 50 rows, and I try to put an AVPlayer at {{0, 1232}, {320, 200}} in row 28 on a screen of {{0, 0}, {320, 480}}. Every other row is a simple cell with a text label of {320, 44}.

However, I get a portion of the AVPlayer showing up on rows 5 and 17 when I scroll to row 28, where the full player shows up.

Any idea what might have gone wrong here?

(Screenshots: the correct display, and the wrong display where the player appeared twice, on rows 5 and 17.)
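One likely culprit (an assumption, since the cell-configuration code isn't shown) is cell reuse: if the AVPlayerLayer is added to a dequeued cell and never removed, that cell can be recycled for other rows with the layer still attached. A sketch of the usual defensive fix, using a dedicated cell subclass:

```swift
// Keep the player layer in its own cell class and strip it on reuse,
// so recycled cells never carry a stale AVPlayerLayer to another row.
class PlayerCell: UITableViewCell {
    var playerLayer: AVPlayerLayer?

    override func prepareForReuse() {
        super.prepareForReuse()
        playerLayer?.removeFromSuperlayer()
        playerLayer = nil
    }
}
```

Dequeuing PlayerCell only for row 28, with a reuse identifier separate from the plain text cells, would also keep the player out of the recycling pool for the other rows.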

iOS app submission error : ERROR ITMS-90032: "Invalid Image Path - No image found at the path referenced under key 'CFBundleIcons': 'AppIcon29x29'"

When I tried to submit the archive to the App Store, I kept getting the following errors:

ERROR ITMS-90032: "Invalid Image Path - No image found at the path referenced under key 'CFBundleIcons': 'AppIcon29x29'"

ERROR ITMS-90032: "Invalid Image Path - No image found at the path referenced under key 'CFBundleIcons': 'AppIcon40x40'"

ERROR ITMS-90032: "Invalid Image Path - No image found at the path referenced under key 'CFBundleIcons': 'AppIcon60x60'"

I am sure I set all the app icons, but I still get the same errors. (I have published apps before and I followed the same process.) I am using Xcode 6.3.

I have checked the related questions; many suggested removing the CarPlay icon, but I don't see such an icon when I click into the app icons. The only icon sizes required are 58x58, 87x87, 80x80, 120x120, and 180x180. I don't see anywhere that requests icons of size 29x29, 40x40, or 60x60. Has anyone had the same problem? Thank you so much in advance for your help!

Cannot change class variables at runtime in Swift

This is what I want to do: I want to change a class variable while the app is running and have the player use the new value.

let pa = PlayerAv()

class PlayerAv
{
    var audioLink = ""
    var player: AVPlayer
    init()
    {
        player = AVPlayer(URL: NSURL(string: self.audioLink))
    }
}

@IBAction func changeToSabiha() {
    pa.player.pause()
    PlayerAv().audioLink = "http://ift.tt/1cnbIQ0"
    println("\(pa.audioLink)")
    pa.player.play()
}
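As a side note on the snippet above: `PlayerAv().audioLink = ...` creates a brand-new instance and immediately discards it, so `pa` is never touched; and because `player` is built once in init from the (empty) audioLink, changing the string later would not affect it anyway. One way to restructure, sketched with a property observer (my suggestion; the URL is a placeholder):

```swift
class PlayerAv {
    var player = AVPlayer()

    // Rebuild the player whenever the link changes.
    var audioLink: String = "" {
        didSet {
            if let url = NSURL(string: audioLink) {
                player = AVPlayer(URL: url)
            }
        }
    }
}

let pa = PlayerAv()
pa.audioLink = "http://example.com/stream.mp3"  // placeholder URL
pa.player.play()
```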

source view controller vs. presenting view controller

I'm reading a book that states that the source view controller is not necessarily the presenting view controller. The source VC is the one that calls the presentViewController:... method, and the presenting VC (its view) is the one that gets obscured by the presented VC's view. I can't think of a single example in which the presenting VC is not the same as the source VC. Please provide some. Thanks
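One case where the two differ: when the source sits inside a container such as a UINavigationController, UIKit may hand the full-screen presentation to the container, so the presenting view controller ends up being the navigation controller while the source is the child that made the call. A sketch (hedged; which presenter UIKit picks depends on the presentation style and on definesPresentationContext):

```swift
// Inside a view controller embedded in a UINavigationController.
// `self` is the source: it calls presentViewController.
let detail = UIViewController()
presentViewController(detail, animated: true) {
    // For a full-screen presentation, detail.presentingViewController
    // is typically the navigation controller, not `self`.
    println(detail.presentingViewController === self.navigationController)
}
```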

xcode - MPNowPlayingInfoCenter info is not displayed on iOS 8

I'm developing a music application, which should play music in the background.

I use the MPMoviePlayerController to play the music. My code to initiate the MPMoviePlayerController:

NSString* resourcePath = [[NSBundle mainBundle] resourcePath];
resourcePath = [resourcePath stringByAppendingString:@"/music.m4a"];
NSError* err;
self.player = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL fileURLWithPath:resourcePath]];
if (err) {
    NSLog(@"ERROR: %@", err.localizedDescription);
}
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback withOptions:AVAudioSessionCategoryOptionMixWithOthers error:nil];
[session setActive:YES error:nil];
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self.player setShouldAutoplay:NO];
[self.player setControlStyle: MPMovieControlStyleEmbedded];
self.player.view.hidden = YES;
[self.player prepareToPlay];

When I execute [self.player play];, the music starts. But I also want to display the name of the song, the name of the album, and the album artwork on the lock screen and in Control Center. I'm using the following code:

Class playingInfoCenter = NSClassFromString(@"MPNowPlayingInfoCenter");
if (playingInfoCenter) {
    NSMutableDictionary *songInfo = [[NSMutableDictionary alloc] init];
    MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"artwork.png"]];
    [songInfo setObject:@"SongName" forKey:MPMediaItemPropertyTitle];
    [songInfo setObject:@"ArtistName" forKey:MPMediaItemPropertyArtist];
    [songInfo setObject:@"AlbumTitle" forKey:MPMediaItemPropertyAlbumTitle];
    [songInfo setObject:albumArt forKey:MPMediaItemPropertyArtwork];
    [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:songInfo];
}

But nothing gets displayed on the lock screen, and nothing shows up in Control Center either.

How can I solve this problem? I didn't find anything on the internet.

Thanks in advance, Fabian.

Unexpected page breaks when printing UIWebView

TLDR: If you print a UIWebView containing HTML elements with text aligned right or center, the resulting document will have pages with unexpectedly large bottom margins.

I came across this issue and was quite surprised that I could not google anyone with a similar problem. I have filed a radar with Apple (#20760071) and also created an issue on ResearchKit's GitHub repo, as this affects their ORKHTMLPDFWriter.

AFAIK this also affects all libraries that use UIWebView for converting HTML to PDF. I have tested:

I am wondering if anyone can come up with some workaround.

How to reproduce:

NSMutableString* html = [NSMutableString string];
[html appendString:@"<html><body>"];
for (int i=0; i<200; i++) {
    [html appendString:@"<div align=\"right\">line</div>"];
}
[html appendString:@"</body></html>"];

UIPrintInteractionController* pc = [UIPrintInteractionController sharedPrintController];

UIPrintInfo* printInfo = [UIPrintInfo printInfo];
printInfo.outputType = UIPrintInfoOutputGrayscale;

pc.printInfo = printInfo;
pc.showsPaperSelectionForLoadedPapers = YES;

UIWebView* web = [UIWebView new];
[web loadHTMLString:html baseURL:nil];
pc.printFormatter = web.viewPrintFormatter;

[pc.printPageRenderer addPrintFormatter:web.viewPrintFormatter startingAtPageAtIndex:0];

[pc presentAnimated:YES completionHandler:^(UIPrintInteractionController *printInteractionController, BOOL completed, NSError *error) {
    NSLog(@"%d -- %@",completed, error);

}];

You can also clone the project demonstrating this issue in ResearchKit.

Sort Parse Query based on difference with Float

I want to sort the results of my Parse query based on the scalar distance from the field temperature to my variable referenceTemp.

I tried the following, which of course does not work, but it illustrates my intention:

var referenceTemp = 47.89
var query = PFQuery(className: "Temperatures")
query.orderByAscending("abs(temperature - referenceTemp)")

How can I do it? Thanks!!
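Parse can't order by a computed expression, so one workaround (my suggestion, workable when the result set is small enough to fetch) is to sort client-side by the absolute difference:

```swift
let referenceTemp = 47.89
let query = PFQuery(className: "Temperatures")
query.findObjectsInBackgroundWithBlock { objects, error in
    if let temps = objects as? [PFObject] {
        // Sort by |temperature - referenceTemp|, closest first.
        let sorted = temps.sorted { a, b in
            abs((a["temperature"] as! Double) - referenceTemp) <
            abs((b["temperature"] as! Double) - referenceTemp)
        }
        // use `sorted` here
    }
}
```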

Transparent and translucent tab bar

For quite some time I've been trying to create a transparent tab bar like the one I've seen in many apps (see the image at the bottom). However, I can't seem to get the effect where it is transparent and blurry at the same time. So far I've just added a tab bar and set translucent to true/YES (depending on Objective-C or Swift). How can I achieve the tab bar below?

(Image: example of the desired blurred, translucent tab bar.)
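For what it's worth, the system blur is what you get when the bar keeps its default background with translucency enabled; clearing the background image removes the blur entirely and leaves pure transparency. A sketch of both directions (assuming iOS 7+ and a UITabBarController's bar):

```swift
// System-style blur + translucency: keep the default background.
tabBarController?.tabBar.translucent = true

// Fully transparent, no blur: clear the background and shadow images.
tabBarController?.tabBar.backgroundImage = UIImage()
tabBarController?.tabBar.shadowImage = UIImage()
```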

Updating an iOS app: do I increment the build number, the version, or both?

I am about to update my app for the first time. I am in Xcode, looking at the project targets. My new version number is 1.0.1, so do I just change the version number here, or do I need to increment the build number too?

Lower volume of AVAudioPlayer sound

I have a background song that I play on an infinite loop while my app is in use, except that the volume is too high... I'd like it to play at half volume. This is how I load the music:

//background music for in-game loop
backgroundLoop = [[NSBundle mainBundle] pathForResource:@"Background" ofType:@"wav"];
GameMusic = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:backgroundLoop] error:NULL];
GameMusic.delegate = self;          //need <...> in .h
GameMusic.numberOfLoops = -1;       //play infinite loop

I've tried a few things, but nothing affects the volume of the track. I've tried this:

GameMusic.volume = 0.5f;

and:

[GameMusic setVolume: 0.5f];

but it doesn't change the volume; the track still plays at max volume...

The strange thing is, if I set the volume to 0.0 ([GameMusic setVolume: 0.0f];), it doesn't play at all, presumably because the volume really is zero... so why doesn't 0.1f or 0.5f adjust the volume, when it's a linear scale from 0-1?

How can I lower the volume of the sound? It's far too loud.

Also, I tried editing the actual sounds in Garageband, but they won't export any sounds under a second long so it's not feasible to adjust them manually.

Thanks.

iOS NSDate() returns incorrect time

I am trying to get the current date in Swift using NSDate(). When I set a breakpoint and stop the application, I can see that there is a 3-hour difference from the device's system time. The application is running on a real iPhone. How do I fix this?

I also wrote the following in App Delegate to be sure:

NSTimeZone.setDefaultTimeZone(NSTimeZone(name: "Europe/Kiev")!)

But it still does not work.
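It may help to note that NSDate has no time zone at all: it stores an absolute instant, and the debugger (and println) render that instant in UTC/GMT, which would account for a fixed 3-hour offset against Kiev local time in summer. The date value itself is correct; only its display needs a formatter with the right zone, e.g.:

```swift
let now = NSDate()  // an absolute instant; the debugger shows it in GMT

let formatter = NSDateFormatter()
formatter.timeZone = NSTimeZone(name: "Europe/Kiev")
formatter.dateFormat = "yyyy-MM-dd HH:mm:ss"
println(formatter.stringFromDate(now))  // local wall-clock time
```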

Xcode 6 Do I have to make an application for every screen size?

So I have created my web application, and when I build it in Xcode I have to set "Simulated Metrics" > "Size" to 3.5-inch, since I have an iPhone 4s. When I build and run it on the iPhone, everything looks perfect, but I want this application to run on iPhone 5/6 as well, and when I change the size to something else in Simulated Metrics, it gets really messed up on my iPhone. I'm using a web view of my responsive website, which shouldn't really care about the screen size, but I believe Xcode does. So, is there a solution where Xcode automatically detects the screen size and makes the web view fill the whole view controller? Or do I have to create an application for every screen size?

Why does my storyboard keep resizing my views

I have a storyboard with a view controller that contains a UITableView. I added views above and below the UITableView to act as a header/footer that scrolls with the table; that is all working appropriately. The problem is that when I resize the header/footer views, they show up correctly in the preview, but they are not the correct size in the app.

Here is how it looks in StoryBoard:

(Storyboard screenshot.)

And when I run the app, the top view takes up the entire screen. But here is the weird thing: if I close Xcode and reopen it, both the header and footer views have a height of 568, which I'm assuming is why they look so huge in the app. Why is Xcode resizing these views? I have tried removing all constraints, but it doesn't make any difference.

I actually removed the view controller entirely from the storyboard and remade it, and it still shows the same behavior.

Thanks in advance.

Determine target's Build Configuration type from static library

How can I get the build configuration type (debug or release) from inside a static library?
Normally we use #ifdef DEBUG, but in this case it will not work, because that check happens at compile time and our static library is already compiled.

Write PDF Document into a file and open it with webview xamarin

I want to download a PDF file from an external server, save it on the device, and then open it in a web view. I already get the file from the server; the problem I'm having is writing it in the correct format to a file. I would appreciate any approach I can take to achieve this.

I was able to load a PDF that's already bundled in the app with a web view.

`CAGradientLayer` Define Start and End Points in Terms of Local Coordinate Space

A CAGradientLayer has two properties startPoint and endPoint. These properties are defined in terms of the unit coordinate space. The result is that if I have two gradient layers with the same start and end points each with different bounds, the two gradients will be different.

How can the startPoint and endPoint of a CAGradientLayer be defined not in terms of the unit coordinate space but in standard point coordinates, so that the angle/size of the gradient is not affected by the bounds of the layer?

The desired result is that a gradient layer can be resized to any size or shape and the gradient remain in place, although cropped differently.


Qualifications: I know that this seems like an absolutely trivial transformation between coordinate spaces, but apparently either, yes, I am in fact that dumb, or there's something broken or extremely counter-intuitive about how CAGradientLayer works. I haven't included an example of what I expect should be the right way to do it, because (assuming I'm just dumb) it would only be misleading.
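For the record, the trivial transformation referred to above would look something like this sketch, kept explicitly hypothetical: re-derive the unit-space values from the layer's current bounds whenever it resizes (e.g. from layoutSublayersOfLayer:), so the gradient stays anchored in point coordinates. The helper name is mine:

```swift
// Convert absolute point coordinates into a layer's unit coordinate
// space, so the gradient stays fixed when the bounds change.
func unitPoint(p: CGPoint, inLayer layer: CALayer) -> CGPoint {
    return CGPoint(x: p.x / layer.bounds.width,
                   y: p.y / layer.bounds.height)
}

// Keep a gradient running from (0, 0) to (0, 300) in points:
gradientLayer.startPoint = unitPoint(CGPoint(x: 0, y: 0), inLayer: gradientLayer)
gradientLayer.endPoint   = unitPoint(CGPoint(x: 0, y: 300), inLayer: gradientLayer)
```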

How to detect audio transients in iOS

For my iOS app, I need to programmatically cut a spoken phrase into words for further processing. I know which words to expect, so I can make some assumptions about where words should start. In any case, though, a transient-detection algorithm/method would be very helpful. Google points me either to commercial products or to highly academic papers that are beyond my brain power. Luckily, you are much smarter and more knowledgeable than I am, so you can help me simplify my problem. Don't let me down!

How to get the list of maximum value for a column for a list of unique key

Let's say I have a table on Parse that has two columns: an identifier set by hand and a numeric property.

I need to write a query that gets me the maximum number on the numeric property per each unique identifier. So in the example below:

| identifier | value |
----------------------
| 1          | 10    |
| 2          | 5     |
| 1          | 7     |
| 2          | 9     |

I would expect the following output:

| identifier | value |
----------------------
| 1          | 10    |
| 2          | 9     |

Now I know Parse doesn't have anything like Group By statements, so this is probably not doable as a single query.

What alternative would you suggest in this case? I see some solutions, each with serious drawbacks:

  • Compose the result from multiple queries. This would require a query that gets the unique list of identifiers and then a separate query per identifier to get the maximum value. This will probably not scale well if the table grows. The result is also not exactly consistent, as the DB can change between queries (for my use case, slightly stale data is not too bad). It will also heavily impact the request quota limit, since a single logical request now triggers a large number of queries.
  • Keep a separate table that tracks this result. This table would have a single row per identifier, containing the max value. For this I would need a beforeSave trigger that updates the second table. From what I've read, there is no guarantee that beforeSave triggers are not executed concurrently, so it's very tricky to ensure I don't accidentally insert multiple values for the same identifier. I would probably have to run a background job that removes duplicates.

For my use case I'll need to get the data on an iOS device so network traffic is also an issue.
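A third option, workable while the table is small (my suggestion): fetch the rows in one query and reduce client-side on the device, keeping the maximum value seen per identifier:

```swift
// rows: [PFObject] fetched from the identifier/value table.
var maxPerIdentifier = [Int: Int]()
for row in rows {
    let id = row["identifier"] as! Int
    let value = row["value"] as! Int
    if value > (maxPerIdentifier[id] ?? Int.min) {
        maxPerIdentifier[id] = value
    }
}
// For the example table this yields [1: 10, 2: 9].
```

This keeps it to a single request, at the cost of transferring every row, so it trades network traffic for request quota.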

Getting Data from Apple iOS App Store?

I want to know the trending apps every month in the App Store.

How can I do that? I know a bit about using Nokogiri; I'm asking about the data source.

I found that there is a source for Google Play (Getting Data from Android Play Store), but I didn't find a data source for the App Store.

Why are cells no longer visibile in my UICollectionView?

The UICollectionViewController class

In my app there are two classes. The first one is a UICollectionViewController. Inside this class I declare a UICollectionView by creating an IBOutlet connection.

The UICollectionViewLayout class

The second one, a UICollectionViewLayout subclass, overrides layout methods such as collectionViewContentSize, layoutAttributesForElementsInRect:, and layoutAttributesForItemAtIndexPath:.

class myLayout: UICollectionViewLayout {

    override func prepareLayout() {
        super.prepareLayout()
    }

    override func collectionViewContentSize() -> CGSize {
        let screenSize: CGRect = UIScreen.mainScreen().bounds
        var center = CGPoint(x: screenSize.width/2.0, y: screenSize.height/2.0)
        return super.collectionViewContentSize()
    }

    override func layoutAttributesForElementsInRect(rect: CGRect) -> [AnyObject]? {
        if let attributes = super.layoutAttributesForElementsInRect(rect) as? [UICollectionViewLayoutAttributes] {
            for attribute in attributes {
                let center = attribute.center
            }
            return attributes
        }
        return nil
    }

    override func layoutAttributesForItemAtIndexPath(indexPath: NSIndexPath) -> UICollectionViewLayoutAttributes {
        let attributes = super.layoutAttributesForItemAtIndexPath(indexPath)
        return attributes
    }
}

Assigning the layout to the UICollectionView

To link the UICollectionViewLayout to the main UICollectionView, I use the collectionViewLayout property in the viewDidLoad method:

override func viewDidLoad() {
    super.viewDidLoad()

    let layout = myLayout()
    myCollectionView.collectionViewLayout = layout
}

After assigning the layout, cells are no longer visible

After running the app, I can no longer see the UICollectionViewCells of the UICollectionView. They were visible before assigning the new layout to the UICollectionView. I can only see the background of the UICollectionView.

Why are cells no longer visible?

didReceiveLocalNotification is never called when launching by tapping the app icon

I can't find the right solution even after long searching. What is the normal scenario in iOS for the cases below?

When the app receives a local notification,

1) didReceiveLocalNotification is invoked when the app is in the foreground, and also when the app is launched by tapping the notification banner.

However, under the same circumstances,

2) didReceiveLocalNotification is not invoked when the app is launched by tapping the app icon.

Stack Overflow and the Apple documentation say that

in case 2), the delegate sequence applicationWillEnterForeground -> didReceiveLocalNotification -> applicationDidBecomeActive is normal, but I've never seen didReceiveLocalNotification called when launching by tapping the app icon.

Please give me advice on whether case 2) is normal or not. Help me!

Changing the position of all children of a node except the parent

I have a main node and it has 5 children. I want to change the position.y of the 5 children without changing the position.y of the main node.

Is there a way to do this? Maybe something like:

for children in mainnode.children {
    children.position.y = children.position.y - 10
}

I know that this isn't right, but maybe something like it.

I have been struggling with this for days now; can anybody help me out?
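That loop is essentially right for SpriteKit: children are positioned in the parent's coordinate space, so moving them does not move the parent. The only wrinkle in Swift 1.x is that `children` is `[AnyObject]` and needs a cast:

```swift
// Move every child of mainnode down 10 points; mainnode itself stays put.
for child in mainnode.children as! [SKNode] {
    child.position.y -= 10
}
```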

BLE peripheral not detected when iOS app is in the background

I am developing a BLE-based iOS application on iOS 8.0. I have a peripheral device that advertises data. When I open my application, scanning starts in viewDidLoad; the peripheral is detected and listed in a table view. That works fine. But if I switch the peripheral off first, then start the iOS app (scanning starts in viewDidLoad), press the home button so the app goes to the background, and then switch the peripheral back on, the app does not detect the device.

(In the current situation, if I switch the iOS device's Bluetooth off and on manually, the backgrounded app detects the peripheral and I get the notification message.)

What is the actual problem with my app, and is there any solution?

Google Analytics SDK for iOS no longer tracking screens or events; it used to work

Several months ago I downloaded the latest version of the Google Analytics SDK for iOS to try out tracking analytics for my app. I followed all the instructions: downloaded the SDK, stubbed out the code, created a Google Analytics account and the appropriate property with its respective reporting view, plugged in the tracking code, and it all worked great. (This was something I was working on that was not ready for production yet, so we did not make it live.)

I put it down for a few months, then recently resurrected the code I had written, and for some reason Google Analytics no longer seems to keep track of the screens I am on, nor does it track the events I send.

I have verbose logging turned on, and this is about all the information I get from the console now. I used to get a lot of info every time I sent screen-view or event data to Google Analytics.

Google analytics console output:

VERBOSE: GoogleAnalytics 3.11 +[GAITrackerModel initialize] (GAITrackerModel.m:88): idfa class missing, won't collect idfa

I don't think this error message matters, because I am not displaying any ads in my app.

INFO: GoogleAnalytics 3.11 -[GAIReachabilityChecker reachabilityFlagsChanged:] (GAIReachabilityChecker.m:159): Reachability flags update: 0X000002

I believe this status code indicates I am able to reach Google Analytics.

INFO: GoogleAnalytics 3.11 -[GAIBatchingDispatcher hitsForDispatch] (GAIBatchingDispatcher.m:368): No pending hits.

I'm not really sure what this means, but after this message is displayed I get no more feedback from Google Analytics whenever I try to send screen or event data.

Here is my code in the app delegate class:

// setup google analytics

// Optional: automatically send uncaught exceptions to Google Analytics.

 [GAI sharedInstance].trackUncaughtExceptions = YES;

// Optional: set Google Analytics dispatch interval to e.g. 20 seconds.
[GAI sharedInstance].dispatchInterval = 20;
[[GAI sharedInstance] setDryRun:NO];

// Optional: set Logger to VERBOSE for debug information.
[[[GAI sharedInstance] logger] setLogLevel:kGAILogLevelVerbose];

// Initialize tracker. Replace with your tracking ID.
[[GAI sharedInstance] trackerWithTrackingId:@"mytrackingIDgoeshere"];

id<GAITracker> tracker = [[GAI sharedInstance] defaultTracker];

tracker.allowIDFACollection = NO;

NSString *version = [[NSBundle mainBundle] objectForInfoDictionaryKey:@"CFBundleShortVersionString"];

[tracker set:kGAIAppVersion value:version];
[tracker set:kGAISampleRate value:@"50.0"];

Here is the code I use to manually send screen and event data.
Some of it is stubbed out to be made generic to not reveal personal info about my app.

// MANUALLY STARTING A SESSION:

// start session

id<GAITracker> tracker = [[GAI sharedInstance] defaultTracker];

// You only need to set User ID on a tracker once. By setting it on the tracker, the ID will be

// sent with all subsequent hits.

[tracker set:@"&uid"
       value:self.sessionID];

[tracker send:[[GAIDictionaryBuilder createEventWithCategory:@"User ID"
                                                      action:@"User Starts Session"
                                                       label:deviceLabel
                                                       value:nil] build]];

GAIDictionaryBuilder *builder = [GAIDictionaryBuilder createScreenView];

[builder set:@"start" forKey:kGAISessionControl];
[tracker set:kGAIScreenName value:vcName];
[tracker send:[builder build]];

// MANUALLY ENDING A SESSION:

 id<GAITracker> tracker = [[GAI sharedInstance] defaultTracker];
    GAIDictionaryBuilder *builder = [GAIDictionaryBuilder createScreenView];
    [builder set:@"end" forKey:kGAISessionControl];
    [tracker set:kGAIScreenName value:vcName];
    [tracker send:[builder build]];

// MANUALLY SENDING SCREEN VIEW INFO TO GA

 id<GAITracker> tracker = [[GAI sharedInstance] defaultTracker];
    [tracker set:kGAIScreenName value:sceneName];
    [tracker send:[[GAIDictionaryBuilder createScreenView]build]];

// EXAMPLE CODE OF MANUALLY SENDING EVENT DATA TO GA

id<GAITracker> tracker = [[GAI sharedInstance] defaultTracker];
    [tracker set:kGAIScreenName value:sceneName];
    [tracker send:[[GAIDictionaryBuilder createEventWithCategory:@"ui_action" action:@"table_row_selected" label:rowTitle value:nil]build]];
     [tracker set:kGAIScreenName value:nil];

And that is it. This all used to work, and I keep poring over the documentation again and again and nothing seems to be wrong.

How do I troubleshoot this issue?

Thank you for your time.

Video Buffer Output with Swift

My goal is to take the video buffer and ultimately convert it to NSData, but I do not understand how to access the buffer properly. I have the captureOutput function, but I have not been successful in converting the buffer, and I'm not sure I am actually collecting anything in the buffer at all. This is all Swift code; I have found some examples in Objective-C, but I am not able to understand the Obj-C code well enough to figure it out.

var captureDevice : AVCaptureDevice?
var videoCaptureOutput = AVCaptureVideoDataOutput()
var bounds: CGRect = UIScreen.mainScreen().bounds
let captureSession = AVCaptureSession()
var captureConnection: AVCaptureMovieFileOutput?


override func viewDidLoad() {
    super.viewDidLoad()
    captureSession.sessionPreset = AVCaptureSessionPreset640x480
    let devices = AVCaptureDevice.devices()

    for device in devices {
        if (device.hasMediaType(AVMediaTypeVideo)) {
            if device.position == AVCaptureDevicePosition.Back {
                captureDevice = device as? AVCaptureDevice
                if captureDevice != nil {
                    beginSession()
                }
            }
        }
    }
}

func beginSession() {
    var screenWidth:CGFloat = bounds.size.width
    var screenHeight:CGFloat = bounds.size.height
    var err : NSError? = nil
    captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &err)!)

    if err != nil {
        println("Error: \(err?.localizedDescription)")
    }

    videoCaptureOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey:kCVPixelFormatType_32BGRA]
    videoCaptureOutput.alwaysDiscardsLateVideoFrames = true


    captureSession.addOutput(videoCaptureOutput)


    videoCaptureOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL))
    if captureSession.canAddOutput(self.videoCaptureOutput) {
        captureSession.addOutput(self.videoCaptureOutput)
    }
}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // I think this is where I can get the buffer info.
}

What is a practical example of using Kiwi's KWCaptureSpy?

I'm having trouble understanding what a practical application of Kiwi's KWCaptureSpy is. I could do something like this and have it pass:

 __block id successJSON;

  KWCaptureSpy *successBlockSpy =
      [HNKServer captureArgument:@selector(GET:parameters:completion:)
                         atIndex:2];

  [[HNKServer sharedServer] GET:@""
                     parameters:nil
                     completion:^(id JSON, NSError *error) {
                       successJSON = JSON;
                     }];

  HNKServerRequestCallback successBlock = successBlockSpy.argument;
  successBlock(@"JSON", nil);

  [[successJSON shouldEventually] equal:@"JSON"];

but that doesn't seem to actually be testing anything. The example in Kiwi's documentation doesn't help: http://ift.tt/1bycsk2

Has anyone had a good reason to use KWCaptureSpy in practice?

get amount of NSCalendarUnits in NSDateComponents

I want to write a function which returns specific NSCalendarUnits from a String. My function therefore takes as many NSCalendarUnits as arguments as the user wants.

public extension String{
    func calendarComponents(units: NSCalendarUnit, separateString: String) -> NSDateComponents{
    // Now I want to know which units the user specified,
    // so I can scan the string for integer values and add them to
    // the corresponding component.
    }
}

As a result, I want the user to define a String like var myStringDate = "4/5" and then use the function

myStringDate.calendarComponents(.CalendarUnitMonth | .CalendarUnitWeekday,
    separateString: "/")

and then get an NSDateComponents object with .month = 4 and .weekday = 5.

I do know how to scan the string for all the needed values, but I don't know how to add those values to the right component attribute.
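One possible approach (a sketch in Swift 1.2-era syntax to match the question): NSCalendarUnit is a bit mask, so each unit's presence can be tested with a bitwise AND on `rawValue`, and NSDateComponents has a generic `setValue(_:forComponent:)` setter that avoids a long switch over attributes. The fixed unit ordering below is an assumption about how string pieces map to units.

```swift
import Foundation

public extension String {
    func calendarComponents(units: NSCalendarUnit, separateString: String) -> NSDateComponents {
        let components = NSDateComponents()

        // Assumption: requested units consume the string's pieces in this
        // fixed order (year, month, weekday, day). Extend as needed.
        let ordered: [NSCalendarUnit] = [.CalendarUnitYear, .CalendarUnitMonth,
                                         .CalendarUnitWeekday, .CalendarUnitDay]

        var values = self.componentsSeparatedByString(separateString)
                         .map { ($0 as NSString).integerValue }

        for unit in ordered {
            // Bitwise test: was this unit included in the mask?
            if units.rawValue & unit.rawValue != 0 && !values.isEmpty {
                let value = values.removeAtIndex(0)
                // Generic setter maps the value onto the matching attribute.
                components.setValue(value, forComponent: unit)
            }
        }
        return components
    }
}
```

With this sketch, `"4/5".calendarComponents(.CalendarUnitMonth | .CalendarUnitWeekday, separateString: "/")` would yield a components object with month 4 and weekday 5.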

Titanium - JHEAD ERROR: can't open image even if image is showing

We are writing an application at school with Titanium; my part is to create a gallery. Here's the problem: I want to display images using imageAsThumbnail and pass the resulting thumbnailBlob to an ImageView. Whenever the imageAsThumbnail method is used I get the error below, even though the images display correctly on both Android and iOS. The error shows in the log only on Android, once for each image. How can I solve this problem?

[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/1GM_artiglieria.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/3527_primaguerramondiale.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/Trincea_alpina.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/Unknown.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/alpini.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/guerra_epocale_prima_guerra_mondiale_mostra__5__1.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/hqdefault.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/images.jpeg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/mondia46.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/soldato-francese-in-trincea-prima-guerra-mondiale.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/tesina-prima-guerra.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/trinceawwi.jpg'
[ERROR] JHEAD: can't open '/android_asset/Resources/images/1/1/images/trincee-russe-nella-foresta-di-sarikamish_h_partb.jpg'

file.js

exports.ApplicationWindow = function ApplicationWindow(a,b) {

var resourcesDirectory = Titanium.Filesystem.resourcesDirectory;
var imageWidth = 300;

/*TOP MENU*/
var numH=require('db').numH();
var numW=require('db').numW();
var distanzaOttimale=require('db').distanzaOttimale();
var h=require('db').defaultElementsHeight();
/*TOP MENU END*/

var dir = Titanium.Filesystem.getFile(resourcesDirectory + 'images/' + a + '/' + b + '/images');
var images = dir.getDirectoryListing();

dir = null;

var scrollView = Ti.UI.createScrollView({
    backgroundColor: '#fff',
    top: h,
    contentWidth: Ti.UI.FILL,
    contentHeight: 'auto',
    height: 'auto',
    scrollType: 'vertical',
    showHorizontalScrollIndicator: false, // booleans, not the strings 'false'/'true'
    showVerticalScrollIndicator: true,
    layout: 'horizontal',
    horizontalWrap: true
});

var left, imageFile, imageBlob, thumbnailBlob, thumbnailImage;

for (var i = 0, l = images.length; i < l; ++i) {

    // Build the relative path as one string; passing separator fragments
    // ('images/', '/', ...) as separate getFile arguments produces
    // doubled slashes, since getFile joins its arguments itself.
    imageFile = Titanium.Filesystem.getFile(resourcesDirectory,
        'images/' + a + '/' + b + '/images/' + images[i]);

    imageBlob = imageFile.read();

    thumbnailBlob = imageBlob.imageAsThumbnail(300, 0, 0); // JHEAD error is logged here

    /* 3- [ 30 ] -2- [ 30 ] -2- [ 30 ] -3 */

    // Ternary operator =  condition ? then : else
    left = (i % 3 === 0) ? 3 : 2;

    thumbnailImage = Ti.UI.createImageView({
        top: 10,
        left: left + '%',
        width: '30%',
        image: thumbnailBlob,
        rowID: i,
        opacity: 0
    });

    thumbnailImage.addEventListener('click', onImageClick);

    scrollView.add(thumbnailImage);

    fadeIn(thumbnailImage, 50 * (i + 3));
}

// Add a small filler view at the bottom of the scroll view
scrollView.add(Ti.UI.createView({
    width: Ti.UI.FILL,
    height: 10
}));

function onImageClick(e) {
    // Ti.API.info('Row ID : ' + e.source.rowID);

    var newWin = Ti.UI.createWindow({
        url: 'display.js',
        backgroundColor: '#000',
        zIndex:1,
        page: e.source.rowID,
        a: a,
        b: b
    });

    newWin.open();
}

function fadeIn(view, delay) {
    setTimeout(function () {
        view.animate({
            opacity: 1,
            duration: 200
        });
    }, delay);
}

return scrollView;
};
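One thing worth ruling out is a malformed path: `Ti.Filesystem.getFile` joins each of its arguments with a path separator, so mixing fragments like `'images/'` and `'/'` into the argument list can easily produce doubled slashes. A small pure-JS helper (illustrative, not a Titanium API) builds the relative path in one place so the file reference stays clean:

```javascript
// Build the gallery's relative image path as a single clean string,
// e.g. buildImagePath(1, 1, 'alpini.jpg') -> 'images/1/1/images/alpini.jpg'
function buildImagePath(a, b, fileName) {
    return ['images', String(a), String(b), 'images', fileName].join('/');
}

// Usage inside the loop above (hypothetical rewrite of the getFile call):
// imageFile = Titanium.Filesystem.getFile(resourcesDirectory,
//     buildImagePath(a, b, images[i]));
```

If the paths are clean and the JHEAD errors persist only on Android while thumbnails still render, they are likely coming from the EXIF reader failing on files packaged inside the APK's `android_asset` tree, in which case they may be safe to ignore or avoidable by copying the blobs to a writable directory first; that behavior is platform-specific, so it is worth verifying against the Titanium docs for `Ti.Blob.imageAsThumbnail`.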